Feminist Data

Data science gives us numbers we can use to definitively demonstrate the work that still needs to be done to make the world more inclusive and safer for all.

It is well known in the data science community that Big Data has a bias problem. It is incredibly susceptible to the pre-existing biases and prejudices of our society. This is easy to understand: we humans produced this data, and so it reflects the worst of our opinions back at us.

The problem arises when we build ML models and other decision-making software intended for frequent use in our day-to-day lives. These systems are meant to improve the efficiency of decisions that have to be made repeatedly, but if they are fed data sets carrying the same biases as our society, they will reflect those biases back at us.
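As a minimal sketch of that mechanism (all data is synthetic and all variable names are hypothetical), the toy example below trains an ordinary classifier on historical decisions that penalise one group. The model faithfully learns the penalty rather than correcting it: the bias arrives entirely through the labels, not through any malicious modelling step.

```python
# Toy demonstration: a model trained on biased historical labels
# reproduces the bias. Synthetic data only; names are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# A "qualification" score, identically distributed across both groups.
skill = rng.normal(size=n)
group = rng.integers(0, 2, size=n)  # two demographic groups, 0 and 1

# Historical decisions: driven by skill, but with a penalty applied to
# group 1 -- the societal bias baked into the training data.
hired = (skill - 0.8 * group + rng.normal(scale=0.5, size=n)) > 0

X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, hired)

# The learned model gives group membership a large negative weight:
# it has encoded the historical bias, not corrected it.
print("coefficients (skill, group):", model.coef_[0])
for g in (0, 1):
    rate = model.predict(X[group == g]).mean()
    print(f"predicted hire rate for group {g}: {rate:.1%}")
```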

Caroline Criado Perez

A feminist author, journalist and activist, Criado Perez wrote the best-selling book 'Invisible Women', which shines a light on how little data is collected and studied about women because the 'male' body is treated as the default 'normal' human body.

As a result, women aren't included in the data that is used to make important decisions in all areas of life, from the medical industry to the design of public transport.

As Criado Perez puts it:

“They say women are too complicated to measure, or we haven’t collected data on them in the past so we can’t start now because there is no comparable data. That’s not good enough when you’re talking about half the world.”


Joy Buolamwini

Buolamwini's project 'Gender Shades' revealed that the gender classification products of IBM, Microsoft and Face++ were significantly biased in favour of white men.

To test these products, Buolamwini used a specially curated dataset designed to be evenly split by sex, with subjects drawn from three African countries and three European countries.
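To make that kind of audit concrete, here is a minimal sketch of a disaggregated evaluation, assuming nothing about the real benchmark: it breaks classifier accuracy down by intersectional subgroup, which is exactly where a healthy-looking headline accuracy can hide large gaps. The records below are invented for illustration.

```python
# Disaggregated evaluation: accuracy per intersectional subgroup.
# The prediction records here are invented, not the real benchmark.
from collections import defaultdict

# Each record: (predicted_gender, true_gender, skin_type)
records = [
    ("male",   "male",   "lighter"),
    ("male",   "male",   "lighter"),
    ("female", "female", "lighter"),
    ("female", "female", "darker"),
    ("male",   "female", "darker"),   # misclassified
    ("male",   "female", "darker"),   # misclassified
]

totals = defaultdict(int)
correct = defaultdict(int)
for predicted, true, skin in records:
    key = (true, skin)                # the intersectional subgroup
    totals[key] += 1
    correct[key] += (predicted == true)

overall = sum(correct.values()) / len(records)
print(f"overall accuracy: {overall:.0%}")  # looks tolerable...
for key in sorted(totals):
    acc = correct[key] / totals[key]
    print(f"{key[0]}/{key[1]} accuracy: {acc:.0%}")  # ...until broken down
```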

These were the results: all three products performed worst on darker-skinned women, with error rates of up to 34.7%, while the error rate for lighter-skinned men never exceeded 0.8%.

"Fighting for Algorithmic Justice is my calling"


Data Feminism

Lauren Klein and Catherine D'Ignazio's book Data Feminism discusses how challenging power can mobilize data science to push back against unequal power structures.

They argue that it's important to consider who you are trying to persuade with your data. If you want to show that there is bias or inequality in your community, it is not your neighbours you need to convince that the inequality exists, but those in positions of power and members of dominant groups who, through their background and influence, bear some responsibility for helping fix it.

To move from data ethics to data justice, we should first examine and name power:

Because doing so can identify a source of problems in technical systems.

Because this helps us acknowledge "structural power differentials" and work towards removing their influence.

"Machine learning algorithms don't just predict the past; they also reflect current social inequities"


These women, who have worked tirelessly to fight for a more equal world in the data sphere, demonstrate how data science can positively impact society by showing, in numbers, how much further we have to go to make society a fairer, more inclusive place.

Through their research, we can identify specific areas of life where we need to take meticulous care when using data to answer specific questions.

These include facial recognition software, the criminal justice system and healthcare, but examples can be found in many other areas of life.

These numbers give us the tools we need to enact direct and positive change for all.



Author Details

Theodora Dowglass
