The Daily Pennsylvanian is a student-run nonprofit.

Incoming Penn professor Danaë Metaxa studied gender and racial biases in Google Image Search results. Credit: Kylie Cooper

Faculty experts in communications and computer science discussed how internet algorithms perpetuate gender and racial biases on Friday afternoon at a virtual event.

The event, titled “Annenberg Conversations on Gender: From Algorithm Audits to Accountability,” covered topics ranging from implicit biases in Google Image Search results to the need for diversity in professional communities. Stevens University Professor and Penn Integrates Knowledge Professor Duncan Watts led a conversation with incoming Computer and Information Science assistant professor Danaë Metaxa.

Friday’s event was part of Penn's “Annenberg Conversations on Gender” series, an initiative that will explore topics and issues relating to gender through discussions with academics, activists, artists, and others throughout the 2021-2022 academic year. 

The speakers discussed Metaxa’s research on the racial and gender biases perpetuated by internet algorithms and their effects. In a recent project, Metaxa conducted an “algorithm audit,” which they described as “trying to understand the internal processes of some system to which you don’t have full access.” 

Metaxa conducted an audit of Google Image Search results, studying the gender and racial biases present in searches for different professional occupations. 

They compared Google Image Search results to data describing the racial and gender diversity of various fields — including doctors, lawyers, and real estate agents — from the United States Bureau of Labor Statistics. They found that women and people of color are underrepresented in search results compared to their proportions in the workforce, by roughly 7% to 15%.
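In rough terms, the comparison at the heart of such an audit is a gap between two percentages: a group’s share of the search results versus its share of the actual workforce. The sketch below illustrates that calculation; the occupations and figures are hypothetical placeholders, not numbers from Metaxa’s study or the Bureau of Labor Statistics.

```python
# Minimal sketch of an audit-style representation-gap comparison.
# All percentages below are illustrative placeholders, not real data.

# Hypothetical share of women observed in image-search results, per occupation
search_pct_women = {"doctor": 35.0, "lawyer": 30.0, "real estate agent": 45.0}

# Hypothetical share of women in the actual workforce for the same occupations
workforce_pct_women = {"doctor": 43.0, "lawyer": 38.0, "real estate agent": 57.0}

def representation_gap(search_pct, workforce_pct):
    """Percentage-point gap per occupation.

    A positive value means the group appears less often in search
    results than in the workforce (underrepresentation).
    """
    return {job: workforce_pct[job] - search_pct[job] for job in search_pct}

gaps = representation_gap(search_pct_women, workforce_pct_women)
for job, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"{job}: underrepresented by {gap:.1f} percentage points")
```

The real study aggregates this kind of gap across many occupations and demographic groups; this sketch shows only the core arithmetic of comparing search-result proportions against workforce proportions.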

The project also measured the effects of these biases on the searcher. Metaxa created synthetic Google Image Search pages with different proportions of women and people of color to study how searchers respond to different results. 

Researchers asked participants questions about various occupational fields, including their interest level in the field and how welcome they would feel if they joined the field. 

Metaxa said that showing more women in search results increases women’s interest in joining that particular field, while men become less interested and feel like they may not belong.

“This shows us that, not only can we actually change how people think about the world and their place in it by changing the algorithmic content that they see, but it also tells us that those effects are not going to be standard for everyone and it's not sufficient to just say we have one default user and that's who we consider,” Metaxa said.

They added, however, that removing bias from algorithms alone will not solve the problem of racial and gender inequality.

“It's important to look at the way that algorithms reflect society, but it's also important to understand that there are some social patterns and social issues that we can't just erase with some technical fix,” they added.

Watts and Metaxa also discussed the significance of the intersection of social science and computer science, two very different branches of study. 

“That's something that I love about Penn,” Metaxa said. “Just the fact that there are collaborations and connections between many different disciplines at all of the schools, but especially that it seemed like there was a home for me at Annenberg as well as in CIS.”

When Metaxa joins the faculty at Penn, they will join Watts in holding appointments in CIS and Annenberg.

The conversation closed with a discussion about establishing more inclusive communities among researchers and in other professional fields. The speakers discussed identity, perception, and representation in the workplace, as well as Metaxa’s personal experiences as a non-binary person.

Metaxa said that it’s important for the professional community to prioritize diversity for the right reasons, not just to help fields generate knowledge.

“We [care about diversity] because it's the right thing to do, because it's unjust that massive sets of people have been systematically disenfranchised,” Metaxa said.