Scaling GenAI Initiatives: Insights From Aimpoint Digital

By Marie Merveilleux du Vignaux

After opening his talk at the Dallas roadshow of the Dataiku 2023 Everyday AI Conferences with a comprehensive multi-year history of how interactive Generative AI technology came to be, Aaron McClendon, Head of AI at Aimpoint Digital, asked the question that would form the foundation of his talk: “What can I actually do with this thing?”

As a longtime Dataiku partner, Aimpoint Digital provides tailored, comprehensive data and analytics support for businesses and industries of all sizes. Because of this, they’re uniquely positioned to speak to the kinds of requests clients have about how they’d like to make use of data in their overall strategies.

“We see clients in a wide range of industries, but their requests typically fall into two main categories: structured data and unstructured data.” By “structured data,” McClendon means queries on a dataset like a table or a data lake (i.e., a centralized repository of information). By “unstructured data,” he means data like large PDFs, presentations, or information exchanged over email.
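To make the distinction concrete, here is a minimal sketch (ours, not from the talk) contrasting a query against structured, tabular data with the chunking step that unstructured text typically needs before a model can work with it. The table contents and document text are invented for illustration.

```python
# Illustrative sketch of the two request categories; data is invented.
import sqlite3

# Structured data: a tabular source you can query directly.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE policies (vehicle TEXT, tier TEXT, annual_cost REAL)")
conn.execute("INSERT INTO policies VALUES ('F-150', 'Gold', 640.0)")
rows = conn.execute(
    "SELECT tier, annual_cost FROM policies WHERE vehicle = ?", ("F-150",)
).fetchall()
print(rows)  # [('Gold', 640.0)]

# Unstructured data: free text (PDFs, emails, slides) that must be chunked
# before a model or retrieval step can make use of it.
document = "Page 1: warranty terms... Page 2: mileage limits..."
chunks = [document[i:i + 40] for i in range(0, len(document), 40)]
print(len(chunks), "chunks ready for retrieval")
```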

Generative AI Challenges: Cost, Latency, and Determinism

Cost

Information security, and the costs associated with it, is the concern that McClendon raised first. “When you’re deploying a model, there are a lot of ways to keep it secure. Without the proper controls in place, data can become open source. It can become problematic, especially in industries like healthcare and finance where there’s a lot of PII floating around.” He asked his audience to consider not only how to keep information secure, but also how to balance cost-effectiveness with security. “You can deploy as always-on and always available, but you’ll pay more for it. You need to make sure proper controls and agreements are in place to make sure the data is secure.”

[Image: Deploying LLMs in a production environment]

Latency

In addition to making sure the solution is secure, he recommended reviewing the actual load that the models will be taking on. “Latency is another factor that needs to be taken into account. We’ve built some applications where we’ll have eight to nine thousand requests coming in at a time, and being able to handle those requests at scale is something to think about.” 

When implementing AI solutions at scale, it’s important to make careful note not only of how many users will be using those applications, but also of what latency requirements they have. He gave a specific example around customer service representatives, a group typically given a wide range of information from disparate sources to answer customer questions. “They can scan through hundreds of pages of PDFs, talk to a supervisor, or blast a question into a model in a secure environment. In those real-time scenarios, you need low latency requirements so you can service those requests clearly.”
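As a rough illustration of that kind of load, the hypothetical sketch below caps the number of concurrent model calls and tracks end-to-end latency for a burst of requests. The model call is simulated, and none of the numbers are drawn from Aimpoint Digital’s actual deployments.

```python
# Hypothetical sketch: a burst of requests handled with a concurrency cap
# and per-request latency tracking. The model call is simulated.
import asyncio
import random
import time

MAX_CONCURRENT = 64  # cap on in-flight model calls (illustrative)

async def call_model(prompt: str, semaphore: asyncio.Semaphore) -> float:
    """Simulate one LLM request and return its end-to-end latency in seconds."""
    start = time.perf_counter()
    async with semaphore:  # back-pressure: only MAX_CONCURRENT calls in flight
        await asyncio.sleep(random.uniform(0.05, 0.2))  # stand-in for inference
    return time.perf_counter() - start

async def main() -> None:
    semaphore = asyncio.Semaphore(MAX_CONCURRENT)
    prompts = [f"question {i}" for i in range(8000)]  # ~8,000 requests at once
    latencies = sorted(
        await asyncio.gather(*(call_model(p, semaphore) for p in prompts))
    )
    p95 = latencies[int(0.95 * len(latencies))]
    print(f"p95 latency: {p95:.3f}s across {len(latencies)} requests")

asyncio.run(main())
```

In a real deployment, the simulated call would be a request to the model endpoint, and the concurrency cap would be tuned against the provider’s rate limits and the latency targets of the application.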

Determinism

McClendon asked a thought-provoking question. “What if two users ask the same question? Even if it’s the same intent, you might get two different answers.” Models can be configured so that if the same question comes in, the same answer comes out, but intent is a more delicate matter. If two users intend to ask the same question but phrase it similarly rather than identically, the answers might be completely different. He provided guidance here, too. “There are different ways to set up the parameters,” he said. “You can fine-tune the models.”
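As a concrete illustration of setting up those parameters, the sketch below shows one common way to nudge a model toward consistent answers, assuming an OpenAI-style chat completions client. Parameter names vary by provider, and even a fixed seed makes determinism best-effort rather than guaranteed.

```python
# Sketch of nudging a model toward deterministic answers, assuming an
# OpenAI-style chat completions client. The model name is illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def ask(question: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": question}],
        temperature=0,  # remove sampling randomness
        seed=42,        # request reproducible sampling where supported
    )
    return response.choices[0].message.content

# Two users asking the exact same question should now get (nearly) the same
# answer; paraphrased questions with the same intent may still diverge.
print(ask("What does the Gold warranty tier cover?"))
```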

Using Dataiku to Build Models to Use Unstructured Data

McClendon then presented a real use case. “We’ve deployed a number of these in Dataiku environments,” he began, displaying an image of a table of Ford vehicle parameters ranging from engine fuel type and mileage to warranty tiers. “If someone calls in to a CSR and says, ‘I have this vehicle. How much will it cost to get this policy?’ this table is just one page of a 300-page PDF document. If someone’s new and they don’t know where to look for this information, models can accelerate this data discovery.”

[Image: Ford Blue Advantage upgrade plans]
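The sketch below illustrates that retrieval idea in miniature, using TF-IDF similarity as a lightweight stand-in for the embedding-based retrieval a production system would use. The chunks and question are invented, and this is not Aimpoint Digital’s actual pipeline.

```python
# Simplified retrieval sketch: surface the most relevant chunks of a long
# document for a CSR's question. TF-IDF stands in for embedding retrieval.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# In practice these chunks would come from parsing the 300-page PDF.
chunks = [
    "Gold tier: covers powertrain and hybrid components up to 100,000 miles.",
    "Silver tier: covers engine and transmission up to 60,000 miles.",
    "Claims must be filed within 30 days of the repair date.",
]

question = "How much coverage does the Gold plan include?"

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(chunks + [question])
scores = cosine_similarity(matrix[-1], matrix[:-1]).flatten()

# Return the best-matching chunk instead of scanning hundreds of pages.
best = scores.argmax()
print(f"Top match (score {scores[best]:.2f}): {chunks[best]}")
```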

He then transitioned to another Dataiku implementation, built using Python Dash. Users can “come and connect to databases directly with unstructured or structured data, and ingest information. It also allows for dynamic file uploads.” In essence, what his team created is a way for users to work with continuously updated datasets, which in turn keeps the models current and builds accuracy over time.
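For readers unfamiliar with Dash, the sketch below shows the general shape of such an application: a page with a file upload component whose callback is the natural place to ingest new documents into the continuously updated dataset. It is illustrative only, not the team’s actual code.

```python
# Minimal Dash sketch of the pattern described above: dynamic file uploads
# feeding a continuously refreshed knowledge base. Illustrative only.
import base64

from dash import Dash, Input, Output, dcc, html

app = Dash(__name__)
app.layout = html.Div([
    html.H3("Document Q&A"),
    dcc.Upload(id="upload-doc", children=html.Button("Upload a PDF or CSV")),
    html.Div(id="upload-status"),
])

@app.callback(
    Output("upload-status", "children"),
    Input("upload-doc", "contents"),
    Input("upload-doc", "filename"),
    prevent_initial_call=True,
)
def ingest(contents, filename):
    # contents arrives as "data:<mime>;base64,<payload>"
    _, payload = contents.split(",", 1)
    raw = base64.b64decode(payload)
    # In a real app, this is where the file would be chunked, embedded,
    # and added to the continuously updated dataset behind the model.
    return f"Ingested {filename} ({len(raw)} bytes)"

if __name__ == "__main__":
    app.run(debug=True)
```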

Quality Data Fuels Quality Models

In closing, McClendon emphasized the importance of quality data.


The way that you provide information to your models dictates the quality of the response that comes back.

He also mentioned that organizations need to understand and work hard to avoid model hallucination (when AI models produce incorrect or misleading information). He urged his audience to make sure not only that the information they feed into Generative AI models is accurate and useful from the start, but also that the system is continually improved through fine-tuning and optimization over time.
