June 7, 2021

Taking a crack at the glass ceiling and other biases perpetuated by big data

Predictive algorithms are an influential part of daily life, even though most people aren’t aware they exist. The formulas, derived from massive collections of information often referred to as “big data,” are used to determine things like insurance rates, mortgage eligibility and creditworthiness, or to influence decisions such as who is hired for an open position and how much that person’s salary will be.

While relying on just the facts might seem like a great way to make unbiased, equitable decisions, one major issue with big data is that the information used to develop predictive algorithms often reflects the assumptions and biases of those who collected it.

As a result, predictive algorithms can transmit historical prejudices or perpetuate existing gender and racial inequalities.

That’s why Kangwook Lee, a University of Wisconsin-Madison assistant professor of electrical and computer engineering, and Eunsil Oh, an assistant professor of sociology at UW-Madison, are examining the reproduction of social inequality in big data-derived predictive algorithms.

“Big data and artificial intelligence are supported by an ideology that has an almost unquestioning faith toward numbers,” says Oh, whose research primarily focuses on gender, work and family. “What we are talking about is context. These numbers don’t emerge from nowhere.”

Lee, who studies machine learning, has researched the societal implications of big data from a computer science perspective. “Businesses have been using these predictive algorithms for more than 10 years now. Even though they have realized the issues, many are still employing predictions or decision making that is biased,” says Lee. “The question we want to answer is how much impact this has made on society and how we can measure these impacts.”

Their interdisciplinary approach incorporates both sociology and computer science perspectives on bias in big data. In their research, Lee and Oh will draw on the limited number of research papers published on the issue before beginning to collect and analyze datasets.

They plan to look into two particular issues impacted by predictive algorithms: the gender wage gap in the labor market and racial disparities in determining bail in the criminal justice system. They hope to identify the ways in which existing hierarchies are reproduced through these algorithmic predictions.


“The gender wage gap, for instance, can be reproduced by a type of statistical discrimination,” says Oh. “If you hire a woman and calculate her pay based on the average first-year wage for women in the past, it’s going to be lower than the first-year wages for men. The same is true for racial groups because there is group-based wage discrimination in the system. If you use an existing tool to calculate salaries, you’re likely to have a very unequal wage to start with.”
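Oh’s example can be made concrete with a toy calculation. The sketch below is purely illustrative, with made-up numbers; it is not a tool from this project, but it shows how setting a new hire’s salary from a historical group average carries the old gap straight into the new offer.

```python
# Illustrative sketch (hypothetical numbers): a naive salary "predictor"
# that sets a new hire's starting pay to the historical group average.
# Because past wages already contain group-based discrimination, the
# prediction reproduces the same gap for the next hire.

historical_first_year_wages = {
    "men":   [62_000, 60_000, 65_000, 63_000],
    "women": [54_000, 55_000, 53_000, 56_000],
}

def naive_starting_salary(group: str) -> float:
    """Return the average historical first-year wage for the hire's group."""
    wages = historical_first_year_wages[group]
    return sum(wages) / len(wages)

print(naive_starting_salary("men"))    # 62500.0
print(naive_starting_salary("women"))  # 54500.0 -- the historical gap persists
```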

The researchers will examine the datasets and algorithms to identify how they perpetuate bias. Eventually, they hope to develop recommendations for removing or mitigating bias in data-driven decision making that they can disseminate to the data scientists developing predictive algorithms.

For Lee, the project is very different from his previous work on machine learning. “Most of my earlier research was basically doing complex math problems. I never really thought seriously about the implications for humans and society,” he says.

But after deepening his understanding of the potential impact of machine learning on the broader world, he’s much more conscious of the ramifications of his work. “The impact of these equations is very serious,” he says. “I feel like I need to make sure they make some positive changes to society.”

Oh agrees that computer scientists and engineers need to be more thoughtful about the impact of their work and hopes this multidisciplinary project will be a step forward in making data tools more equitable. “When you apply mathematical tools, you have to do it carefully,” she says. “Especially when it’s related to people’s lives.”

The project is one of 15 announced in April 2021 for funding under the Understanding and Reducing Inequalities Initiative at UW-Madison. The initiative is jointly sponsored by the Office of the Vice Chancellor for Research and Graduate Education and the Wisconsin Alumni Research Foundation.
