Rutgers research shows gender bias in media images of different occupations

Vivek Singh, an assistant professor of library and information science, said directly curating content can limit gender stereotypes in media.  – Photo by Rutgers.edu

A study conducted by Rutgers researchers found that gender biases related to different occupations are prevalent in images on various media outlets, according to an article from Rutgers Today.

Images depicting men and women as librarians, nurses, engineers and computer programmers generally represent or reinforce gender stereotypes, according to the article. The study analyzed pictures from Twitter, The New York Times, Shutterstock and Wikipedia and compared them to data from the Bureau of Labor Statistics (BLS) on gender representation in each of these fields.

Women were overrepresented in images of nurses and librarians and underrepresented in the technical fields, according to the study, published in the Journal of the Association for Information Science and Technology. Websites like Twitter, which curate content algorithmically, were more likely to show bias, while directly curated sites like The New York Times and Shutterstock offered more balanced representation.

“More direct content curation will help counter gender stereotypes,” said Vivek Singh, an assistant professor of library and information science in the School of Communication and Information, according to the article.

In the cases of female civil engineers and male nurses, the directly curated sites actually offered more images of them than would be expected based on BLS data, according to the article.

The number of images of women in male-dominated professions on Twitter increased from 2018 to 2019, pointing toward more accurate gender representation in the media, according to the article.

Mary Chayko, a sociologist and interdisciplinary teaching professor at the School of Communication and Information, was a co-author on the study.

“Gender bias limits the ability of people to select careers that may suit them and impedes fair practices, pay equity and equality,” Chayko said, according to the article. “Understanding the prevalence and patterns of bias and stereotypes in online images is essential, and can help us challenge, and hopefully someday break, these stereotypes.”

The researchers said this information could help developers avoid building gender bias into digital media platforms, algorithms and artificial intelligence, according to the article. The study’s findings can also help media outlets determine whether human content curation or algorithmic content curation is the better option, depending on the platform.

The study was co-authored by Raj Inamdar, a research associate at Rutgers’ Behavioral Informatics Lab, and Diana Floegel, a doctoral student at the School of Communication and Information.

