Particularly in the age of Generative AI, companies are rarely at a loss for ideas about use cases that could improve their business. But having the technical resources or know-how to tackle these problems is a different story.
In this recap of a session from Everyday AI New York, get key highlights from Antoine Villedieu de Torcy (Sr. Solution Data Scientist at Dataiku) on how Dataiku solutions — off-the-shelf starter projects and assets for common use cases — can help accelerate your team’s speed to value and solve industry challenges in a fraction of the time.
3 Clear Patterns for Generative AI
This past year, we went from simplified access to a new technology (ChatGPT) to a myriad of new questions: “How will Generative AI transform my business?” “Should it change my priorities?” “Can it cut down costs?” When it comes to putting Generative AI to work (and answering some of these pivotal questions), three clear patterns of use cases emerge: structure, generate, and answer. We will dive into each one of these a bit more below!
Use cases in the “structure” category accelerate the transformation of raw, unstructured data into structured data. Examples include accelerated classification, simplified sentiment analysis, and eased entity extraction (tasks that could be done before Generative AI with standard NLP techniques, but where LLMs greatly accelerate the time to value). So, will LLMs replace what we did before? The answer is a little more complex than pure replacement.
Before LLMs came into the picture, these tasks required a dedicated model for each kind of application, each language used, and so on. Therefore, they required a lot of NLP expertise and a lot of time. LLMs come pre-trained on a wide variety of use cases and languages, so the models can be used directly as Swiss army knives for NLP. There is a trade-off, though. LLMs are much larger, so they are inherently costlier than the previous options.
Dataiku’s Generative AI-powered Customer Satisfaction Reviews use case enables teams to structure entities from raw text. Teams can use LLMs to extract elements from reviews, including entity recognition metadata, down to sentiment analysis on specific product dimensions such as size, fit, quality, fabric, color, or other product-specific values. Once the input is validated and added to a structured dataset, teams can easily derive insights through a self-service analytics application.
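To make the “structure” pattern concrete, here is a minimal sketch of turning a raw review into a structured record with an LLM. The `call_llm` function is a hypothetical stand-in (with a canned response) for whatever LLM endpoint your stack exposes; the prompt and field names are illustrative, not Dataiku’s actual implementation.

```python
import json

def call_llm(prompt: str) -> str:
    # Hypothetical stand-in for a real LLM call; returns a canned
    # JSON response here purely for demonstration.
    return json.dumps({
        "product": "linen shirt",
        "size": "runs small",
        "fit": "slim",
        "sentiment": {"quality": "positive", "fit": "negative"},
    })

PROMPT_TEMPLATE = (
    "Extract the product, its attributes (size, fit, quality, fabric, color), "
    "and the sentiment expressed toward each attribute from this review. "
    "Answer with a single JSON object.\n\nReview: {review}"
)

def structure_review(review: str) -> dict:
    """Turn one raw review into a structured record ready for a dataset."""
    raw = call_llm(PROMPT_TEMPLATE.format(review=review))
    return json.loads(raw)  # validate before appending to the structured dataset

row = structure_review("Love the quality of this linen shirt, but it runs small.")
print(row["sentiment"]["fit"])  # negative
```

Because the model is prompted once for all entities and attribute-level sentiments, a single generic LLM replaces the separate per-task, per-language NLP models described above.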
“Generate” use cases turn (predictive) analytics into tailored content (e.g., dashboards, images, stories). Examples include customized messaging creation, analytics-based report production, and eased self-service insights from predictive analytics. Traditional machine learning (ML) models are trained to make structured predictions, while LLMs are trained to generate content, which further reinforces that LLMs will not replace these models but rather unlock ML’s full impact. Traditional ML models can inform and fuel LLMs, so that the LLM can learn in context and generate strong content.
Dataiku’s LLM-Enhanced Next Best Offer use case uses a product recommendation system to predict the best offer for each customer and craft dedicated marketing campaigns. Designed for banking professionals, the use case enables employees to quickly generate high-impact emails and integrate them into their customer engagement strategies. The use case is composable, meaning it is powered by customer data, existing predictive models built and running within Dataiku, AND Generative AI.
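The composition described above can be sketched as a predictive model feeding its output into an LLM prompt. Everything here is illustrative: `predict_next_best_offer` stands in for an existing scoring model running in Dataiku, and the assembled prompt would be sent to an LLM to draft the email.

```python
def predict_next_best_offer(customer: dict) -> str:
    # Stand-in for an existing predictive model; a real model would
    # score the customer's full feature set, not one threshold.
    if customer["travel_spend"] > 5000:
        return "premium travel credit card"
    return "cashback card"

def build_campaign_prompt(customer: dict) -> str:
    """Fuse the ML prediction into an LLM prompt (in-context learning)."""
    offer = predict_next_best_offer(customer)
    return (
        f"Write a short, friendly marketing email for {customer['name']}, "
        f"recommending our {offer}. Mention that it suits their spending habits."
    )

prompt = build_campaign_prompt({"name": "Alex", "travel_spend": 8200})
print(prompt)
```

The design choice is the key point: the structured prediction stays with the traditional ML model, and the LLM only handles the content-generation last mile.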
Use cases in the “answer” category significantly simplify querying documents and other unstructured data. Examples include conversing with documents and getting answers from multiple sources, capitalizing on internal knowledge that would otherwise sit on a shelf. Dataiku’s LLM-Enhanced ESG Document Intelligence use case is designed to help organizations gather and generate ESG insights from a large and complex corpus of documents in seconds. The use case employs an approach known as Retrieval-Augmented Generation (RAG) to answer questions from documents.
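A minimal sketch of the RAG idea: retrieve the document chunks most relevant to a question, then assemble them into a grounded prompt for the LLM. The retrieval here is a deliberately naive bag-of-words cosine similarity over toy documents; a real pipeline would use embeddings, a vector store, and an actual LLM call.

```python
import math
from collections import Counter

# Toy document corpus (illustrative, not real ESG data).
DOCS = [
    "Our 2023 ESG report targets a 40% reduction in scope 1 emissions by 2030.",
    "The cafeteria menu now rotates weekly and includes vegetarian options.",
    "Board diversity increased to 45% women directors in the last fiscal year.",
]

def vectorize(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(question: str, k: int = 2) -> list:
    """Return the k chunks most similar to the question."""
    q = vectorize(question)
    return sorted(DOCS, key=lambda d: cosine(q, vectorize(d)), reverse=True)[:k]

def build_rag_prompt(question: str) -> str:
    """Ground the LLM's answer in the retrieved context."""
    context = "\n".join(retrieve(question))
    return (
        f"Answer using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )

print(build_rag_prompt("What is the emissions reduction target?"))
```

Because the LLM is instructed to answer only from the retrieved context, RAG lets teams question a large corpus without retraining any model on it.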
LLMs Can Accelerate AI Business Impact, Use Them!
To summarize, LLMs will indeed replace traditional ML on a number of data tasks, accelerating use case delivery in the NLP space. In other areas, they will augment ML’s business value, taking ML impact to the last mile: turning raw predictions into consumable insights and significantly reducing cumbersome content production tasks. They will unlock the potential of documents through simplified Q&A, to the benefit of numerous business processes. Overall, LLMs are a key step forward in full AI democratization and the acceleration of AI impact across functions and industries.