Generative AI and the Actionability of Using AI Today

Scaling AI | Catie Grasso

When asked about the most impressive Large Language Model (LLM) based application they’ve seen built in Dataiku so far, the panelists of this Everyday AI New York session responded with image-to-text descriptions, ChatGPT for augmented information regarding architecture, and visual question answering. The lightning panel, “Generative AI and the Actionability of Using AI Today,” was moderated by Kurt Muehmel (Strategic Advisor at Dataiku) and featured three of Dataiku’s very own Generative AI subject matter experts:

  • Jed Dougherty, VP Platform Strategy
  • Krishna Vadakattu, Senior Product Manager
  • Catalina Herrera, Field CDO

For more details on the hundreds of discussions we’ve had with customers about using Generative AI in the enterprise — including insights on the Generative AI capabilities built into the Dataiku platform — keep reading.

→ Go Further: Dataiku for Generative AI

Basic vs. Advanced LLM Stacks 

Given that it’s been about one year since ChatGPT put these topics at the top of every boardroom conversation, panelists were asked what they see emerging as the standard “LLM stack” in the enterprise. Their answers converged on a few common components:

  • A choice between calling LLMs via an API and running them locally (or some combination of the two)
  • A vector store for Retrieval Augmented Generation (RAG)
  • Potentially, some level of actual training or fine-tuning on top of that
  • An interface for prompt manipulation and understanding
  • A chat interface, or another way of applying these capabilities to a given set of data

We’re also seeing tools for orchestration (connecting a few different models together) and, on the development stack side, exploration with prompt engineering tools and notebooks to track and evaluate results.
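To make the vector store piece of that stack concrete, here is a minimal, self-contained sketch of RAG retrieval. The bag-of-words "embedding," the sample documents, and the prompt template are toy stand-ins for a real embedding model, database, and LLM call:

```python
# Minimal sketch of the RAG retrieval step: embed documents, find the
# closest one to the question, and splice it into the prompt.
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy embedding: a word-count vector. A real stack would call an
    # embedding model here.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class VectorStore:
    def __init__(self):
        self.docs = []  # (embedding, text) pairs

    def add(self, text: str):
        self.docs.append((embed(text), text))

    def top_k(self, query: str, k: int = 1):
        qv = embed(query)
        ranked = sorted(self.docs, key=lambda d: cosine(qv, d[0]), reverse=True)
        return [text for _, text in ranked[:k]]

store = VectorStore()
store.add("Our refund policy allows returns within 30 days of purchase.")
store.add("The Paris office is open Monday through Friday.")

question = "How many days do customers have to return a purchase?"
context = store.top_k(question, k=1)[0]

# The retrieved context is what gets sent to the LLM alongside the question.
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(context)
```

The key design point is that the documents never change the model itself; they are retrieved at query time and injected into the prompt, which is why RAG works without any fine-tuning.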

Where more advanced usage is concerned, we’re still in the early stages, so even the advanced stacks are quite basic (e.g., classic NLP augmented with LLMs, such as sentiment analysis). However, despite this technical simplicity, more people are being empowered to work with data and AI in new ways. Further, advanced use cases are often just collections of relatively simple models interacting with other technologies (i.e., the notion of agents and tools). Finally, while many people are searching for brand-new use cases for Generative AI, you don’t need to: inside existing classification or regression data science workflows, LLMs can be applied to any NLP workflow or text-based data in an organization, so that can be a good place to start for a quick win.
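The "agents and tools" notion can be sketched in a few lines. In the toy version below, a keyword check stands in for the LLM that would normally choose the tool, and both tool names and the tiny knowledge base are invented for illustration:

```python
# Toy sketch of an agent: a router (standing in for an LLM) picks a
# tool, runs it, and folds the result into the answer.

def calculator(expr: str) -> str:
    # Very restricted arithmetic evaluator for the demo.
    allowed = set("0123456789+-*/. ")
    assert set(expr) <= allowed
    return str(eval(expr))

def lookup(term: str) -> str:
    # Stand-in for a document search or API call.
    kb = {"rag": "Retrieval Augmented Generation"}
    return kb.get(term.lower(), "unknown")

TOOLS = {"calculator": calculator, "lookup": lookup}

def agent(request: str) -> str:
    # A real agent would let the LLM emit a tool name and arguments;
    # here a keyword check plays that role.
    if any(ch.isdigit() for ch in request):
        tool, arg = "calculator", request
    else:
        tool, arg = "lookup", request
    result = TOOLS[tool](arg)
    return f"[{tool}] {result}"

print(agent("12 * 4"))   # routed to the calculator tool
print(agent("RAG"))      # routed to the lookup tool
```

Each piece on its own is simple; the value comes from the orchestration that chains the model's decision to the tool's result.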

Where Should Organizations Start When It Comes to Generative AI?

Recommendations from panelists about where to start with Generative AI are:

  • Focus on simplicity. If you’re moving up from traditional NLP (a knowledge worker task you were already handling), there’s a good chance a specialized, task-specific model exists that you can pick up and use to get value right away; sentiment analysis is the classic example. When the output doesn’t quite match what you’d like, that’s the opening for prompt engineering: an opportunity to tailor the outputs from pretty good to precisely what you’re looking for. Going further, if there is specialized knowledge you need to bring into your workflow, you can explore RAG, a very straightforward way to enrich the model with contextual information about your business and industry without fine-tuning. Once you’ve got that specialized knowledge in place and are looking for something truly cutting edge, you can begin to experiment with agents, tool-using workflows, and fine-tuning. Exhaust the simpler possibilities first before investing in these more complex techniques; by the time that initial exploration is complete, the advanced end of the spectrum will probably be more accessible.
  • Experiment, experiment, experiment. See what models are best for your organization’s use case, empower more people to get involved, and take the time to learn about these new concepts.
  • You don’t need to go build a massive architecture stack to get started. Don’t be afraid to dive in, even if you start small! 
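The "start simple, then tailor with prompt engineering" progression can be illustrated with sentiment analysis. The prompts, label set, and example reviews below are hypothetical, and the LLM call itself is omitted; only the prompt construction is shown:

```python
# Zero-shot vs. few-shot prompting for sentiment analysis: the few-shot
# version pins the model to one-word answers from a fixed label set.

LABELS = ["positive", "negative", "neutral"]

def zero_shot_prompt(review: str) -> str:
    # Simplest possible ask; output format is left to the model.
    return f"What is the sentiment of this review?\n\nReview: {review}"

def few_shot_prompt(review: str) -> str:
    # Worked examples steer the model toward exactly the format a
    # downstream workflow needs.
    examples = [
        ("The checkout was fast and painless.", "positive"),
        ("My order arrived broken.", "negative"),
    ]
    shots = "\n".join(f"Review: {r}\nSentiment: {s}" for r, s in examples)
    return (
        f"Classify each review as one of {', '.join(LABELS)}.\n\n"
        f"{shots}\nReview: {review}\nSentiment:"
    )

prompt = few_shot_prompt("Support never answered my emails.")
print(prompt)
```

The point is that this kind of iteration costs only a prompt edit, which is why it comes before RAG or fine-tuning on the complexity ladder.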

Be sure to watch the full video for more valuable insights, such as the biggest misconceptions about LLMs and related apps and services, as well as the key skills to hone to best take advantage of Generative AI.
