The world of data and AI is changing before our eyes. Staying ahead requires not only foresight but also a clear understanding of where the industry is headed. At Dataiku’s recent Product Days event, Clément Sténac, Dataiku CTO and co-founder, shared his insights on the evolving landscape of data, analytics, and AI. This blog explores key takeaways, from revisiting our founding principles to unveiling innovations like Dataiku Answers, showing how Dataiku is empowering organizations to navigate change and harness the transformative power of analytics and AI.
Looking Back to Move Forward
To look ahead, we must reflect on where we started. Founded in 2012, Dataiku emerged during a time when data science was dubbed the “sexiest job of the 21st century.” However, the reality for organizations was challenging — data scientists were extremely hard to find and difficult to retain.
What began as a mission to enhance the productivity of data scientists has since grown into something much larger: enabling seamless collaboration across all roles in data and analytics. Today, Dataiku stands as the Universal AI Platform, empowering everyone — from domain experts to data engineers — to work together. We believe data work exists on a continuum, and success depends on collaboration at every step.
Change Is the Only Constant
The past 12 years have taught us one undeniable truth: Change is the only constant. From Hadoop’s prominence in 2012 to the rise of Spark, the migration to the cloud, and the emergence of deep learning, the landscape has transformed dramatically. Through these changes, Dataiku has adapted, helping organizations navigate new technologies and trends.
Generative AI (GenAI) is the latest chapter in this ongoing story. While its power and complexity are undeniable, it’s not a standalone solution. True value lies in integrating GenAI into existing data, analytics, and AI ecosystems — a challenge Dataiku addresses head-on. Our tools, including the Dataiku LLM Mesh, ensure organizations can adopt GenAI responsibly, securely, and at scale.
The LLM Mesh: Expanding the Possibilities of GenAI
To help organizations navigate the complexity of GenAI, Dataiku introduced the LLM Mesh last year — a common backbone for GenAI applications that promises to reshape how analytics and IT teams securely access GenAI models and services. The LLM Mesh enables organizations to efficiently build enterprise-grade applications while addressing concerns related to cost management, compliance, and technological dependencies. It also enables choice and flexibility among the growing number of models and providers.
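To make the idea of a common backbone more concrete, here is a minimal, purely illustrative sketch of a provider-agnostic gateway in Python. It is not the Dataiku LLM Mesh API; the class names, the mock backend, and the usage log are assumptions used only to show how a single entry point can route requests to interchangeable LLM providers while recording usage centrally.

```python
# Illustrative only: a toy provider-agnostic gateway, not the Dataiku LLM Mesh API.
from dataclasses import dataclass
from typing import Callable

@dataclass
class LLMBackend:
    name: str
    cost_per_1k_tokens: float
    call: Callable[[str], str]  # hypothetical completion function for this provider

class SimpleLLMGateway:
    """Single entry point that routes prompts to a named backend and logs usage."""

    def __init__(self) -> None:
        self.backends: dict[str, LLMBackend] = {}
        self.usage_log: list[dict] = []

    def register(self, backend: LLMBackend) -> None:
        self.backends[backend.name] = backend

    def complete(self, prompt: str, backend_name: str, user: str) -> str:
        backend = self.backends[backend_name]   # one central place for access control
        response = backend.call(prompt)
        self.usage_log.append(                  # raw material for cost and audit reporting
            {"user": user, "backend": backend_name, "prompt_chars": len(prompt)}
        )
        return response

# Usage: applications talk to the gateway, so providers can be swapped without code changes.
gateway = SimpleLLMGateway()
gateway.register(LLMBackend("mock-provider", 0.01, lambda p: f"echo: {p}"))
print(gateway.complete("Summarize Q3 sales performance.", "mock-provider", user="analyst_1"))
```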
Building and Scaling Enterprise-Grade GenAI Applications
GenAI applications are built on key components like prompts, tools, data, and models. These elements must work together, through code or visual frameworks, to deliver solutions that integrate into workflows or operate as user-facing tools.
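As a rough illustration of how those pieces fit together, the sketch below wires a prompt template, a simple "tool," and a stubbed model call into one function. Every name here is hypothetical; it is not Dataiku code, just a minimal example of the composition pattern described above.

```python
# Hypothetical example of wiring a prompt template, a tool, and a model call together.
from datetime import date

def get_refund_policy(product: str) -> str:
    """A 'tool': deterministic business logic or data lookup the model can rely on."""
    return f"{product} can be returned within 30 days of purchase."

def call_model(prompt: str) -> str:
    """Stand-in for an LLM call, which in practice would go through a gateway layer."""
    return f"[model answer based on a prompt of {len(prompt)} characters]"

def answer_customer(question: str, product: str) -> str:
    context = get_refund_policy(product)             # data produced by the tool
    prompt = (                                       # prompt template combining the pieces
        f"Today is {date.today()}. Using only this policy:\n{context}\n"
        f"Answer the customer question: {question}"
    )
    return call_model(prompt)                        # model produces the final answer

print(answer_customer("Can I still return this?", "Model X headphones"))
```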
One of the core principles of Dataiku is that we help you build applications and put them into production so they deliver real value, not just remain as experiments.
- Clément Sténac, CTO and co-founder of Dataiku
Scaling these applications goes beyond just building them. Organizations need to ensure that their applications deliver sustainable, real-world impact. Dataiku combines the LLM Mesh with advanced GenAI application-building tools to accelerate development, achieve compliance with evolving regulations, and embed governance-ready solutions for long-term success.
What’s New: Expanding the Possibilities With Dataiku
Dataiku Answers
Dataiku Answers is a secure, responsive chat interface fully integrated with the LLM Mesh. Designed to simplify and scale GenAI for enterprises, it empowers users to apply Retrieval-Augmented Generation (RAG) techniques to automatically extract precise, trusted information from organizational knowledge banks. Additionally, teams can use the application to directly query datasets — calculating averages, totals, and more — to gain instant, accurate insights.
This powerful solution can also serve as a general-purpose productivity tool for ad hoc analyses or tasks. Users can simply upload documents or images, provide instructions, and instantly get answers or newly generated content. Thanks to the LLM Mesh, data is protected and traced every step of the way, with embedded content moderation, PII screening, and caching capabilities.
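For readers less familiar with RAG, the pattern underneath is straightforward: retrieve the most relevant snippets from a knowledge bank and pass them to the model together with the question. The sketch below is a deliberately simplified illustration, assuming a toy keyword-overlap retriever and a stubbed generation step; it is not how Dataiku Answers is implemented (production systems typically use embeddings and a vector store).

```python
# Simplified RAG pattern: naive retrieval over a small knowledge bank plus stubbed generation.
knowledge_bank = [
    "Expense reports must be submitted within 30 days of travel.",
    "Remote employees may claim a home-office stipend once per year.",
    "All vendor contracts above 50,000 EUR require legal review.",
]

def retrieve(question: str, documents: list[str], k: int = 2) -> list[str]:
    """Rank documents by word overlap with the question (real systems use embeddings)."""
    q_words = set(question.lower().split())
    ranked = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return ranked[:k]

def generate(question: str, context: list[str]) -> str:
    """Stand-in for an LLM call that answers using only the retrieved context."""
    return f"Answer to '{question}', grounded in: {' / '.join(context)}"

question = "When do expense reports need to be submitted?"
print(generate(question, retrieve(question, knowledge_bank)))
```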
💡Coming Soon: Unified Chat Portal: Streamline workflows with a centralized chat interface that leverages multiple AI agents (e.g., legal, sales) behind the scenes and automatically routes questions to the right AI agent for the job.
LLM Guard Services
LLM Guard Services provide a comprehensive suite of tools designed to help organizations manage GenAI effectively and securely. These services address key concerns like cost management, quality assurance, and safety, offering the control needed to scale GenAI with confidence.
Cost Guard, a key component of LLM Guard Services, equips teams with detailed cost-tracking capabilities by user, project, and provider, ensuring complete transparency. Teams can set spending limits and monitor usage so organizations avoid overruns and unexpected expenses.
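Conceptually, this kind of cost tracking boils down to attributing each call's spend to a user, project, and provider, then comparing cumulative totals against budgets. The sketch below is only an illustration of that bookkeeping, with hypothetical class and field names; it does not reflect Cost Guard's actual implementation.

```python
# Conceptual bookkeeping for spend by user, project, and provider, with budget checks.
from collections import defaultdict

class CostTracker:
    def __init__(self, limits: dict[tuple[str, str], float]) -> None:
        self.limits = limits                      # (project, provider) -> budget
        self.spend = defaultdict(float)           # (user, project, provider) -> total spend

    def record(self, user: str, project: str, provider: str, cost: float) -> None:
        self.spend[(user, project, provider)] += cost

    def project_spend(self, project: str, provider: str) -> float:
        return sum(
            amount
            for (user, proj, prov), amount in self.spend.items()
            if proj == project and prov == provider
        )

    def over_budget(self, project: str, provider: str) -> bool:
        limit = self.limits.get((project, provider), float("inf"))
        return self.project_spend(project, provider) > limit

tracker = CostTracker(limits={("churn-analysis", "provider-a"): 100.0})
tracker.record("analyst_1", "churn-analysis", "provider-a", 42.5)
print(tracker.over_budget("churn-analysis", "provider-a"))   # False: still under budget
```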
💡Coming Soon: Proactive Blocking: Automatically enforce spending limits by blocking usage once budget thresholds are reached.
LLM Evaluation
The LLM evaluation framework helps organizations assess the quality and relevance of LLM responses so they meet enterprise standards. It provides tools to evaluate outputs against both standard and custom metrics and to monitor them in production, keeping results accurate, compliant, and relevant to organizational needs.
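To give a flavor of what a custom metric can look like, the sketch below scores a small batch of answers with two toy checks: a naive "groundedness" ratio and a length limit. Both metrics and the sample data are invented for illustration and are not part of Dataiku's evaluation framework.

```python
# Invented custom metrics applied to a small batch of model outputs.
def groundedness(answer: str, source: str) -> float:
    """Toy metric: fraction of answer words that also appear in the source text."""
    answer_words = answer.lower().split()
    source_words = set(source.lower().split())
    return sum(word in source_words for word in answer_words) / max(len(answer_words), 1)

def within_length(answer: str, max_words: int = 50) -> bool:
    """Toy compliance check: keep answers short enough for the target channel."""
    return len(answer.split()) <= max_words

outputs = [
    {
        "answer": "Expense reports are due within 30 days.",
        "source": "Expense reports must be submitted within 30 days of travel.",
    },
]
for output in outputs:
    print(groundedness(output["answer"], output["source"]), within_length(output["answer"]))
```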
💡Coming Soon: User Feedback Integration: Incorporate real-time feedback to continuously refine outputs and enhance model performance.
LLM Registry and Regulatory Readiness
As AI regulations continue to evolve, Dataiku provides AI governance tools that enhance accountability and compliance. The Dataiku LLM Registry tracks and documents LLM usage across the organization, while prebuilt solutions simplify adherence to frameworks such as the EU AI Act.
💡Coming Soon: LLM Suitability Controls: Define which LLMs can perform specific tasks (e.g., restrict queries involving Personally Identifiable Information (PII) or other sensitive data to self-hosted models), ensuring robust compliance and data protection.
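To illustrate the kind of routing rule such suitability controls imply, here is a conceptual sketch that sends prompts appearing to contain PII to a self-hosted model and everything else to an external provider. The regular expressions and model names are hypothetical, and this is not the forthcoming Dataiku feature itself.

```python
# Conceptual routing rule: prompts that appear to contain PII go to a self-hosted model.
import re

PII_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),         # US SSN-like number
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),   # email address
]

def contains_pii(text: str) -> bool:
    return any(pattern.search(text) for pattern in PII_PATTERNS)

def choose_model(prompt: str) -> str:
    """Hypothetical suitability rule; real controls would be policy-driven and auditable."""
    return "self-hosted-llm" if contains_pii(prompt) else "external-provider-llm"

print(choose_model("Summarize this contract clause."))                   # external-provider-llm
print(choose_model("Email jane.doe@example.com about her refund."))      # self-hosted-llm
```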