Yeshi Milner: Fighting for a More Equitable and Inclusive World With Data

Jacqueline Kuo

When I started working my first analytics job after college, I thought that data would give us a complete view of our world, and shed light on truths we as humans would not have known. I was excited to learn from the data and use it to give a voice to underrepresented communities.

Yeshimabeit “Yeshi” Milner would have laughed at my naivete and pointed out that, while data has the potential to “fight bias, build progressive movements, and promote civic engagement,” history tells us that data is more often used as “an instrument of oppression, reinforcing inequity and perpetuating injustice” (from the mission statement of Data for Black Lives). Milner is the founder of Data for Black Lives and has focused her entire career on using data to create concrete and measurable change in the lives of Black people. When I read about Milner on the History of Data Science website, I was moved and eager to share her vision in this blog.

Data for Black Lives aims to tell the stories of people of color who are disenfranchised by algorithm-driven decisions. Milner is bringing to light inequity in data access: how the wealthiest people (often white men) are the ones who own the data, decide what to do with it, and implement the algorithms that dictate so many aspects of our day-to-day lives. With Data for Black Lives, Milner is creating a political home for data scientists who want to fight against this algorithmic injustice.

Some of the work that Data for Black Lives is doing:

  • They started the #NoMoreDataWeapons campaign. “Data weapon” refers to “any technological tool used to surveil, police, and criminalize Black and Brown communities.” For change to occur, we need to start seeing algorithms that perpetuate systemic racism as real weapons that harm minority communities. Data weapons pervade every aspect of our lives; examples include “race-based risk calculators in medicine grounded in eugenics that send Black people to an early grave and social media ad delivery algorithms that send rental housing ads to Black people but homeownership ads to white people.”

  • They are working to ban facial recognition in retail stores. Retailers have used this technology to alert security whenever someone previously suspected of a crime enters the store, and it has been deployed mostly in non-white and lower-income communities.

  • More on their work here.


After working in the data science space for a few years, I have completely changed my approach to data. I realize now that the real learning doesn’t come from trusting what the data tells you; it comes when you distrust your data completely, when you relentlessly interrogate it, like questioning a guilty witness who refuses to speak, until you have identified all of the inherent human prejudices and biases. (Shameless plug: Dataiku has many resources on Responsible AI and some awesome built-in features that help data professionals understand biases in their data and models – see more here.)
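To make that interrogation a little more concrete, here is a minimal sketch of one of the simplest questions you can put to a dataset: do outcomes differ across groups? This is plain pandas, not a Dataiku feature, and the data, column names, and 0.8 threshold (the common “four-fifths rule” of thumb) are purely illustrative.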
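```python
import pandas as pd

# Hypothetical approval decisions; groups and values are illustrative only.
df = pd.DataFrame({
    "group":    ["A", "A", "A", "B", "B", "B", "B", "A"],
    "approved": [1,    1,   0,   0,   0,   1,   0,   1],
})

# First question to ask the data: what is the approval rate per group?
rates = df.groupby("group")["approved"].mean()
print(rates)

# Disparate impact ratio: least-favored group's rate over the most-favored's.
ratio = rates.min() / rates.max()
print(f"Disparate impact ratio: {ratio:.2f}")
if ratio < 0.8:
    print("Potential disparity worth investigating further.")
```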

Once we know what our data lacks, how do we use data to create a world that is more equitable and inclusive? Organizations like Data for Black Lives are at the heart of answering this question.