
Using artificial intelligence to detect discrimination

July 10, 2019

UNIVERSITY PARK, Pa.--A new artificial intelligence (AI) tool for detecting unfair discrimination--such as on the basis of race or gender--has been created by researchers at Penn State and Columbia University.

Preventing unfair treatment of individuals on the basis of race, gender or ethnicity, for example, has been a long-standing concern of civilized societies. However, detecting such discrimination in decisions, whether made by human decision makers or by automated AI systems, can be extremely challenging. This challenge is further exacerbated by the wide adoption of AI systems to automate decisions in many domains--including policing, consumer finance, higher education and business.

"Artificial intelligence systems--such as those involved in selecting candidates for a job or for admission to a university--are trained on large amounts of data," said Vasant Honavar, Professor and Edward Frymoyer Chair of Information Sciences and Technology, Penn State. "But if these data are biased, they can affect the recommendations of AI systems."

For example, he said, if a company historically has never hired a woman for a particular type of job, then an AI system trained on this historical data will not recommend a woman for a new job.

"There's nothing wrong with the machine learning algorithm itself," said Honavar. "It's doing what it's supposed to do, which is to identify good job candidates based on certain desirable characteristics. But since it was trained on historical, biased data it has the potential to make unfair recommendations."

The team created an AI tool for detecting discrimination with respect to a protected attribute, such as race or gender, in decisions made by human decision makers or by AI systems. The tool is based on the concept of causality, in which one thing, a cause, produces another thing, an effect.

"For example, the question, 'Is there gender-based discrimination in salaries?' can be reframed as, 'Does gender have a causal effect on salary?,' or in other words, 'Would a woman be paid more if she was a man?' said Aria Khademi, graduate student in information sciences and technology, Penn State.

Since it is not possible to directly know the answer to such a hypothetical question, the team's tool uses sophisticated counterfactual inference algorithms to arrive at a best guess.

"For instance," said Khademi, "one intuitive way of arriving at a best guess as to what a fair salary would be for a female employee is to find a male employee who is similar to the woman with respect to qualifications, productivity and experience. We can minimize gender-based discrimination in salary if we ensure that similar men and women receive similar salaries."

The researchers tested their method using various types of available data, such as income data from the U.S. Census Bureau, to determine whether there is gender-based discrimination in salaries. They also tested their method using the New York City Police Department's stop-and-frisk program data to determine whether there is discrimination against people of color in arrests made after stops. The results appeared in May in Proceedings of The Web Conference 2019.

"We analyzed an adult income data set containing salary, demographic and employment-related information for close to 50,000 individuals," said Honavar. "We found evidence of gender-based discrimination in salary. Specifically, we found that the odds of a woman having a salary greater than $50,000 per year is only one-third that for a man. This would suggest that employers should look for and correct, when appropriate, gender bias in salaries."

Although the team's analysis of the New York stop-and-frisk dataset--which contains demographic and other information about people stopped by the New York City police--revealed evidence of possible racial bias against Hispanic and African American individuals, it found no evidence of discrimination against them on average as a group.

"You cannot correct for a problem if you don't know that the problem exists," said Honavar. "To avoid discrimination on the basis of race, gender or other attributes you need effective tools for detecting discrimination. Our tool can help with that."

Honavar added that as data-driven artificial intelligence systems increasingly determine how businesses target advertisements to consumers, how police departments monitor individuals or groups for criminal activity, how banks decide who gets a loan, whom employers decide to hire, and how colleges and universities decide who gets admitted or receives financial aid, there is an urgent need for tools such as the one he and his colleagues developed.

"Our tool," he said, "can help ensure that such systems do not become instruments of discrimination, barriers to equality, threats to social justice and sources of unfairness."
-end-

Other authors on the paper include Sanghack Lee, associate research scientist at Columbia University and former graduate student in information sciences and technology, Penn State, and David Foley, graduate student in informatics, Penn State.

The National Institutes of Health and National Science Foundation supported this research.

EDITORS: Dr. Honavar is available at vhonavar@ist.psu.edu or 814-865-3141. Aria Khademi is available at khademi@ist.psu.edu.

Penn State
