Best Practices for a Successful AI Center of Excellence

By Catie Grasso

When it comes to AI initiatives, everyone talks about building a so-called Center of Excellence (CoE). However, doing so successfully is easier said than done. It requires tight coordination and collaboration not only within the CoE itself, but also with business teams across the organization.

This is a guide for CoEs on how to collaborate with the business to ensure success. The key lies in creating early wins, building demand, and then scaling out to more and more use cases (without growing CoE staff exponentially, which means introducing efficiencies along the way).

To do all of these things, there’s nothing more important than collaborating with business teams and units — they are critical to this picture of success. But with different teams juggling different priorities and directives, getting business units on board can be a challenge. Here are four best practices CoEs can follow to ensure business units are on their side:

1. Find the Right Team(s) to Start With

Working with the wrong teams at the outset can mean false starts and roadblocks along the way, whether because a team lacks the budget, the mindset, the motivation, or the people. In other words, collaborating with the business will go more smoothly if the team both wants to collaborate and has the resources to do so.

The first team (or teams) is instrumental in getting AI initiatives off the ground, and there are many different strategies for finding the right partners-in-crime. The more boxes a team checks, the better a candidate it is likely to be:

[Checklist: Find the Right Team(s) to Start With]

2. Create a Structure of Support and Enablement

Business units will want to work with the CoE if there is adequate support and enablement that makes it easy for them to understand how to get started, as well as what resources they will need to commit upfront. Tips from Dataiku customers with successful CoEs (including Rabobank and GE Aviation) for creating structure that supports the business and, in turn, generates demand include:

  • Setting formal objectives. There should be agreed-upon service level agreements (SLAs) with the business for project delivery as well as established measures of success. From the CoE perspective, this prevents AI projects from dragging on with continual feature creep. For the business, because they know what to expect (and when), they can devote the right resources — from budget to staff. Hear from Rabobank on how they accelerated their AI efforts through a CoE.
  • Gamifying adoption. When it makes sense, gamification can be a low-cost program that adds structure by encouraging individuals on the business side to enhance data quality. For example, a large, global oil and gas organization rolled out a points system: each time someone completes a training, tags a dataset, creates new documentation, etc., that person receives a certain number of points, creating a competitive spirit with a leaderboard and prizes (see the sketch after this list).
  • Formalizing training. If members of the CoE have to onboard and hand-hold each individual new user, scaling can become inefficient quickly. More importantly, it can become frustrating for the business if it’s done poorly. Creating a community of people working on AI initiatives who can help each other (particularly with the support of early super users) can relieve some of the support demands on the CoE itself. GE Aviation went one step further with 100-, 200-, and 300-level courses to onboard end users to their self-serve data efforts, plus a full-day executive training available to anyone who wants to take it.
  • Providing technology that allows for scaling. Reuse is the simple concept of avoiding rework in AI projects, from small details (like code snippets that can be shared to speed up data preparation) to the macro level (like ensuring two data scientists from different parts of the company aren’t working on the same project). Capitalization in Enterprise AI takes reuse to another level: it’s about sharing the cost incurred by an initial AI project (most commonly the cost of finding, cleaning, and preparing data) across other projects, resulting in many use cases for the price of one, so to speak.
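
To make the mechanics of the gamification tip concrete, here is a minimal sketch in Python of the kind of points-and-leaderboard scheme described above. The activity names, point values, and the build_leaderboard helper are illustrative assumptions, not a description of any specific customer's program.

```python
from collections import Counter

# Hypothetical point values per activity -- illustrative numbers only.
POINT_VALUES = {
    "completed_training": 50,
    "tagged_dataset": 10,
    "created_documentation": 25,
}

def build_leaderboard(activity_log):
    """Tally points per person from (person, activity) events and
    return participants sorted from most to fewest points."""
    scores = Counter()
    for person, activity in activity_log:
        scores[person] += POINT_VALUES.get(activity, 0)
    return scores.most_common()

# Example usage with made-up events
log = [
    ("amina", "completed_training"),
    ("jordan", "tagged_dataset"),
    ("amina", "created_documentation"),
    ("jordan", "completed_training"),
    ("jordan", "tagged_dataset"),
]
print(build_leaderboard(log))  # e.g., [('amina', 75), ('jordan', 70)]
```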

3. Generate Demand

Creating a support system can help ensure lasting adoption, but it won’t automatically create demand. One big factor in demand generation is use cases. Ideally, at the beginning of a CoE’s life, a pipeline would be seeded with carefully curated use cases that have both high business value and a high likelihood of success (otherwise known as “quick win” or “low-hanging fruit” use cases). Having people from the business side transfer into the CoE to work on these types of use cases can be a very successful way to get the group started.

Additional strategies for generating demand include:
  • Being programmatic in the approach to evangelizing the CoE and the value of AI. This includes defining and communicating the CoE’s value proposition and being clear and outspoken about what the group can provide and what its short- and long-term goals are.
  • Running AI ideation workshops to both help evangelize and find additional use cases from across lines of business that might be good candidates for the CoE to support.
  • Strongly supporting the first use cases done by the lines of business. This may seem like a lot of hand-holding at first, but it will give the business the confidence and resources to keep moving forward. Eventually, they will be able and willing to become more independent.
  • Leveraging and nurturing AI champions. Work hard to establish footholds in lines of business through the use of champions, who can continue to provide quality use cases and support.
  • Building communities around each profile to be enabled — for example, a community of analysts, data scientists, etc., for sharing ideas and best practices.

4. Promote

It’s often difficult to isolate the contribution of data alone to improvements, especially to larger business outcomes (like higher profit margins, lower costs, etc.). The calculation is complicated because the value isn’t all in one number — it can be spread across multiple departments and teams. For these reasons, measuring ROI for data projects can end up being a data project in and of itself, which is often difficult to justify.

However, just because it’s hard doesn’t mean it shouldn’t be a priority, and business value — not simply innovation — must remain the focus. Creating more business value over time is the goal. That means CoEs starting out need to choose a flagship project and communicate its value to garner more support, and mature CoEs still need to do the work to not only quantify their impact, but also make sure everyone at the organization understands it. If business units see success and value, they will be more willing to collaborate further, or to get started with the CoE if they haven’t already.

More established CoEs can evolve how they quantify value — think bigger picture than just ROI for specific use cases. This might take the form of benchmarking AI maturity, setting concrete goals for where the company would like to be on the AI maturity curve, and then reassessing progress on a quarterly basis. Bigger-picture value will encourage business units across the organization to get on board in hopes of moving the needle.

For example, Dataiku has developed a five-step Enterprise AI maturity model that multinational companies worldwide are using as a framework to measure and communicate value at a more macro level across the business.

These best practices represent a robust framework for CoEs aiming to achieve successful AI integration within their organizations. Stay tuned for our next installment, where we will explore essential strategies designed to empower business units in optimizing the benefits of AI.
