Key Foundations for Achieving EU AI Act Readiness

Scaling AI | David Talaga, Jacob Beswick

With the EU AI Act now published in the Official Journal and coming into force on August 1, 2024, ensuring your organization is ready for compliance is more crucial than ever.

In the second webinar of our Dataiku AI Governance web series, Jacob Beswick, Director of AI Governance Solutions at Dataiku, presents the essential foundations for achieving EU AI Act readiness. This blog aims to inform and engage readers on the critical steps necessary for compliance and explains how Dataiku can support your journey to full readiness.

A Recap of Our First Webinar

If you're new to our series, here's a brief recap of our first webinar. The EU AI Act aims to promote the use of human-centric and trustworthy AI, ensuring high levels of health, safety, and fundamental rights protection. It introduces a risk-based tier system for AI applications, ranging from prohibited to high-risk, limited-risk, and minimal-risk categories, each with specific compliance requirements.

One critical concept covered previously is the EU AI Act's extraterritorial reach, affecting organizations within and outside the EU that either operate or sell AI systems in the EU. We also explored the implications for general-purpose AI models, the responsibilities of model providers, and the stringent penalties for non-compliance, including fines of up to €35 million or 7% of annual global turnover.

One topic we did not cover previously is the Act's new AI literacy obligations, which require providers and deployers of AI systems to ensure a “sufficient and appropriate level of AI literacy” among the employees who develop and use these systems.

Why Readiness Matters

Readiness under the EU AI Act is about more than just compliance — it's about embedding a proactive approach to AI Governance within your organization that supports future EU AI Act compliance. Achieving readiness helps in several key areas:

  1. Reducing Compliance Costs: Organizations can avoid the costly rush of last-minute compliance efforts by planning and navigating the regulatory landscape early.
  2. Enhancing Understanding: Building a thorough understanding of the Act and its requirements across the organization helps in seamless implementation.
  3. Minimizing Risks: Proactively managing AI risks ensures that your AI systems are trustworthy and aligned with regulatory expectations.

Becoming ready for the EU AI Act requires two types of foundations, organizational and technical, which we will detail in the following sections.

[Image: Key pillars for EU AI Act readiness]

Organizational Foundations

To establish a robust foundation for EU AI Act readiness, organizations must focus on several key areas:

  1. Regulatory Understanding: Organizations must develop a deep understanding of the EU AI Act. This ranges from basic awareness to comprehensive knowledge of the Act and its future compliance steps. Although no organization has perfected this yet, the goal should be to steadily increase understanding across all levels of the organization.
  2. Leadership and Sponsorship: Effective leadership is crucial for readiness. Leadership should be aware of the EU AI Act and prioritize it by assigning a dedicated sponsor (this may take different forms depending on the organization, including a group structure such as a board) and by developing a readiness strategy, complemented by a roll-out plan, that secures organizational buy-in and drives implementation. This leadership support is essential for moving the readiness agenda forward.
  3. Clear Responsibilities: Clearly defined roles and responsibilities are vital. At the operational level, teams should know their specific duties related to the EU AI Act. This clarity prevents siloed efforts and ensures everyone works towards common compliance goals.
  4. AI Governance Framework: Organizations should establish a robust AI Governance framework that sets the rules and processes for managing AI systems. This framework should align with the organization's priorities and the requirements of the EU AI Act; a minimal sketch of how such rules might be written down follows this list.
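
To make this concrete, the sketch below shows one possible way to write governance rules and processes down before they live in a dedicated governance tool. It is illustrative only: the review stages, role names, and sponsor title are assumptions for the example, not requirements of the Act or of any particular framework.

```python
# Illustrative only: one way to make governance rules and processes explicit.
# The stage names, role names, and sponsor title below are assumptions.

GOVERNANCE_FRAMEWORK = {
    "sponsor": "Chief Data Officer",  # dedicated leadership sponsor for readiness
    "review_stages": [
        {"stage": "intake",        "owner_role": "project lead",       "output": "intended-purpose statement"},
        {"stage": "qualification", "owner_role": "risk & compliance",  "output": "risk-tier assessment"},
        {"stage": "approval",      "owner_role": "business-area head", "output": "sign-off to deploy"},
        {"stage": "monitoring",    "owner_role": "ML operations",      "output": "periodic review record"},
    ],
}

def next_stage(current: str) -> str | None:
    """Return the review stage that follows `current`, or None if it is the last one."""
    stages = [s["stage"] for s in GOVERNANCE_FRAMEWORK["review_stages"]]
    idx = stages.index(current)
    return stages[idx + 1] if idx + 1 < len(stages) else None

print(next_stage("qualification"))  # -> approval
```

Even a simple, explicit structure like this makes it easier to discuss who owns each step and to migrate the same rules into a governance platform later.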

Technical Foundations

Technical foundations for EU AI Act readiness involve several practical considerations:

  1. AI System Awareness: Organizations need a comprehensive inventory of all AI systems in use, development, or procurement. This includes understanding each system's status, deployment plans, and associated risks. Using platforms like Dataiku can help maintain this inventory, ensuring all AI systems are tracked and managed effectively. Unidentified AI systems should be considered a risk to the organization.
  2. General-Purpose AI Models: With new obligations for general-purpose AI models, organizations must ensure they have relevant documentation from model providers and understand how these models are being used. It is essential to differentiate between formal use (general-purpose AI models incorporated into AI systems) and informal use (staff accessing these models ad hoc, for non-qualified purposes, through interfaces the model providers make available) within the organization.
  3. System Qualification: Starting the qualification exercise for AI systems as part of AI Act readiness prepares organizations for future compliance. This involves understanding the intended purpose of each AI system, associating it with the established risk tiers and logging its risk level, and ensuring it operates as expected.
  4. Accountability and Ownership: Establishing clear lines of accountability is crucial because it builds an organizational-level understanding of who will take responsibility for future compliance activities as these become formalized. As part of AI Act readiness, consider assigning accountability at the business-area level (a senior manager responsible for a department or program); this provides a lens on exposure to risk and to new compliance obligations that can support decision-making about prioritization. Complement this with accountability at the team level, which sets expectations about the ‘cost’ of AI systems operating today or planned for the future. A minimal sketch of such an inventory, including these accountability fields, follows this list.
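
As a starting point, the sketch below shows one way an inventory record could be structured so that status, risk tier, general-purpose model usage, and lines of accountability are captured in one place. It is a minimal, illustrative sketch: the field names, tier labels, and example system are assumptions for the example, not a compliance artifact.

```python
# Illustrative sketch of an AI system inventory record. Field names, tier
# labels, and the example entry are assumptions, not prescribed by the Act.
from dataclasses import dataclass
from enum import Enum

class RiskTier(Enum):
    PROHIBITED = "prohibited"
    HIGH = "high-risk"
    LIMITED = "limited-risk"
    MINIMAL = "minimal-risk"
    UNQUALIFIED = "not yet qualified"  # unqualified systems are themselves a risk

@dataclass
class AISystemRecord:
    name: str
    status: str                  # e.g., "in use", "in development", "in procurement"
    intended_purpose: str
    risk_tier: RiskTier
    uses_gpai: bool              # relies on a general-purpose AI model
    gpai_usage: str | None       # "formal" (embedded in the system) or "informal" (ad hoc staff use)
    business_area_owner: str     # senior manager accountable at the business-area level
    team_owner: str              # team accountable for day-to-day operation

inventory = [
    AISystemRecord(
        name="invoice-triage-classifier",
        status="in use",
        intended_purpose="Route incoming invoices to the right finance queue",
        risk_tier=RiskTier.MINIMAL,
        uses_gpai=False,
        gpai_usage=None,
        business_area_owner="Head of Finance Operations",
        team_owner="Finance Data Team",
    ),
]

# A simple readiness check: flag anything not yet qualified against the risk tiers.
unqualified = [s.name for s in inventory if s.risk_tier is RiskTier.UNQUALIFIED]
print(f"{len(unqualified)} system(s) still need qualification: {unqualified}")
```

However the inventory is ultimately maintained, the point is that every system has a recorded status, a risk tier (or an explicit “not yet qualified” flag), and a named owner at both the business-area and team level.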

How Dataiku Can Help

Building the technical and organizational foundations for EU AI Act readiness requires not only understanding the challenges of organizational change but also having the flexibility to customize governance processes. Dataiku supports this by providing a customizable governance tool that structures and facilitates the systematic implementation of operational workflows for AI projects.

This enables companies to accelerate their preparation for the EU AI Act by establishing the essential foundations, clear responsibilities, and streamlined approval workflows, ensuring compliance, auditability, and transparency of deployed models and projects.

[Image: EU AI Act solution in Dataiku]

Putting It All Together

Achieving EU AI Act readiness is a multi-faceted endeavor that requires organizational and technical foundations. It demands proactive leadership, clear responsibilities, a robust AI Governance framework, comprehensive AI system awareness, detailed system qualification processes, and established lines of accountability.

By focusing on these key foundations, your organization can be well-prepared for the EU AI Act, fostering a culture of trustworthy, human-centric AI that is compliance-ready.

As we continue our AI Governance web series, we will explore these themes further, providing practical guidance and insights to help your organization navigate the path to compliance. Join us in our upcoming webinar, part three, where we will discuss building a system of trust and demonstrate how Dataiku Govern can support your AI Governance needs.
