Our Role in Fighting Racism and Inequality

By Florian Douetteau, Dataiku

Instead of being another tragic event in a long list of injustices against the Black community, George Floyd’s death has sparked a tsunami of anger, protest, reflection, and resolve for meaningful change across the world. This spark has made it clear that change must come from each of us: as individuals, as global citizens, as companies, and as Dataiku. The events of the past few weeks have galvanized our resolve to not only deeply question the way we operate but to make long-term changes that go beyond talk.


We’ve tried to build a community of people who are valued beyond their job titles, around intrinsic qualities such as benevolence and respect. We’ve worked to build communities — both internally and externally — that don’t tolerate any form of racism, and where everyone can feel safe. But we must go further.

I’ve waited to make a statement on behalf of Dataiku not because I don’t believe Black Lives Matter. On the contrary: I believe it is the only way to feel. I also believe that white privilege is a reality of our society, and that unconscious bias is a reality of our brains.

At Dataiku, collectively, we didn’t want to pile on another statement or talk about change without walking the walk. We took the time, as a team and as individuals, to listen to each other, open up a dialogue, and shape ideas and plans we could commit to as a collective. Here is a preview of the first of these commitments.

Changing Internal Processes

Part of any sustained, long-term solution to change is looking internally at our own processes:

  • Moving forward, Dataiku will audit and rethink the way we hire, with a strong focus on increasing the diversity of our organization. Our goal is to become representative of society as it is in each of the countries where we operate.
  • We will audit and make changes to our promotions structure to remove discrimination (conscious or unconscious). 
  • We will also raise the standard to which we hold our vendors and suppliers to make sure we don't work with companies whose business practices exacerbate racial inequalities.

Measuring progress on these initiatives is non-trivial; our plan is to have diverse candidates in the interview pool for 90% of our job positions. We hope to reach this goal within six months.

Providing Support: Money and Mentorship

We have a responsibility to provide education and mentoring for the future Black and minority data science leaders of the world. Through our ikig.ai initiative, we will create a mentorship and internship camp for junior high school students from underprivileged areas.

Likewise, grassroots movements often need more than just goodwill. So we’re putting our money behind meaningful causes such as Black Lives Matter, Black Girls Code, and Blacks in Technology. Each Dataiker’s donation to a non-profit working on Diversity, Equity, and Inclusion (DEI) — overcoming racism and supporting local community action — will be matched up to $100.

Growing the Positive Use of AI

Examples of bias, discrimination, and racism in AI systems are abundant, and the number of cases that never reach the public eye is greater still. Much of the harm doesn’t come from bad intentions but from ignorance and mistakes that can be mitigated through tooling and education. With thousands of users worldwide and hundreds of the world’s largest organizations building AI systems on our platform, the influence we have to facilitate positive change in the era of data is not lost on me.

Features like subpopulation analysis in Dataiku allow users to compute a table of statistics to compare outcomes across defined subpopulations — including race. Individual prediction explanations, introduced in Dataiku 7, also provide increased transparency and more opportunities to analyze bias. We’ve always said that, even with these tools, deciding whether a difference constitutes a significant finding with respect to fairness is left to the user; and though we’ll continue to develop product features for Responsible AI, we need to do more.
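
To make the idea concrete, here is a minimal, standalone sketch of what a subpopulation comparison looks like, using plain pandas and scikit-learn rather than any particular product. The column names and data are hypothetical, and the statistics shown (per-group positive prediction rate and accuracy) are just two of many possible fairness-oriented metrics:

```python
import pandas as pd
from sklearn.metrics import accuracy_score

# Hypothetical scored dataset: one row per person, with the observed outcome,
# the model's prediction, and a sensitive attribute defining subpopulations.
df = pd.DataFrame({
    "group":     ["A", "A", "A", "A", "B", "B", "B", "B"],
    "actual":    [1, 0, 1, 0, 1, 0, 1, 0],
    "predicted": [1, 0, 1, 1, 0, 0, 1, 0],
})

def subpopulation_stats(rows: pd.DataFrame) -> pd.Series:
    """Simple per-group statistics: size, positive prediction rate, accuracy."""
    return pd.Series({
        "count": len(rows),
        "positive_rate": rows["predicted"].mean(),
        "accuracy": accuracy_score(rows["actual"], rows["predicted"]),
    })

# One row of statistics per subpopulation; large gaps between groups
# (e.g., in positive_rate) are a signal to investigate potential bias further.
report = df.groupby("group")[["actual", "predicted"]].apply(subpopulation_stats)
print(report)
```

A real analysis would use a model’s actual scored output and a richer set of metrics, and, as noted above, judging whether any gap between groups is meaningful remains up to the practitioner.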

In the short to medium term, we’re committing to developing and delivering Responsible AI training for customers, going in-depth on practical approaches for users (biased data, bad models, etc.) and touching on higher-level topics around bias for executives. Our strategic advisory practice will work with customers to build pipelines and AI systems that fully integrate Responsible AI practices. We will also continue to address the topic of bias in AI in our content (e.g., articles like Explaining Bias in Your Data) and at EGG conferences worldwide.

Checking on Progress

Of course, transforming these topics and plans into something real will require resolve and energy. The real danger is committing to something that disappears faster than a change in the news cycle. We will meet five months from now to measure our progress (and, hopefully, the progress of the world around us). We hope that by putting this statement out there, our employees, users, customers, partners, and community will hold us to it.
