This article is written by a student writer from the Her Campus at Wisconsin chapter.

As humans, it is almost impossible not to hold biases toward certain people, places and things. Stereotypes surrounding race and gender are so embedded in our daily interactions, learning and decision making that our subjectivity often leads us to scrutinize certain groups, whether consciously or not. These ingrained societal biases and forms of discrimination are further amplified by algorithms in the digital world.

The machine learning that powers algorithms replicates human thought, which in turn upholds societal stereotypes about specific people, mainly those who are already marginalized in the real world. In the case of race and gender, algorithms disproportionately target minority groups, reinforcing stereotypes through search engine rhetoric, data collection and user-driven technology. The omnipresence of algorithms upholds racist and sexist ideologies through a concealed process, perpetuating society’s already stigmatized perceptions of women, women of color and people of color.

In the last decade, society's reliance on algorithms to make decisions for us has increased substantially. Predictive search engine results, facial recognition technology and recommended advertisements ease the stress and strain of our lives, and in easing them, leave us little occasion to reflect on the harm algorithmic technology imposes on society. On the internet, we are no longer people but rather compilations of data that allow algorithms to make assumptions about our identities for the sake of power and profit. In Google’s case, browsing data assigns each user an algorithmic gender through the binary lens of “male” or “female” for marketing purposes and profitable convenience.

While decision-making systems benefit data-driven companies and organizations, the ability to access data ends up endangering the livelihoods of many minority group members. Certain groups are targeted on the basis of poverty, suspected crime or simply for general surveillance. Those who are surveilled are predominantly people of color, exemplifying the inherent racism reflected through algorithms. The collection of information through algorithms establishes a “red-flagging” system, one which forms a feedback loop of injustice and silences marginalized voices. On the other hand, the voices of white men, those who play the largest role in the creation of algorithms, are magnified.

In 2015, Hispanic and Black workers made up only 4.7 percent and 2.2 percent, respectively, of employees at Silicon Valley’s top technology companies, numbers that shrink further among top executives. This is a major flaw in large conglomerates that pride themselves on user-friendliness, diversity and inclusivity. The irony is that these technology companies inevitably do more harm than good when it comes to algorithmic software, especially in terms of race. In 2017, the iPhone's facial recognition software was unable to differentiate between Asian faces; in 2015, two Black teenagers were labeled as gorillas by Google Photos; and in 2009, HP computers were deemed racist for failing to detect darker skin tones with their facial tracking feature.

In the case of the HP computer, the viral YouTube video “HP Computers Are Racist” shows the computer’s facial recognition software failing to identify Desi Cryer’s Black face. In the video, Cryer can be heard saying, “I’m Black. . . . I think my blackness is interfering with the computer’s ability to follow me.” This seemingly inadvertent assumption of whiteness in HP’s facial recognition technology exemplifies white privilege, reinforcing the societal advantage of whiteness both on the internet and in the physical world. When whiteness is seen as the default, all else becomes the “other.”

Because whiteness and masculinity act as the “norm” in society, Black women are arguably put under the most scrutiny in both the real world and the digital world. At the intersection of Blackness and femininity, the racist and sexist experiences of Black women are amplified. In a study conducted by feminist scholar Safiya Noble, entering “Why are Black women so” in a search engine produced predicted results such as angry, lazy, sassy, mean and loud. However, when the same question was posed with “white” replacing “Black,” white women were described as pretty, beautiful and skinny.

The language imposed upon Black and white women through Google’s prioritization of web results upholds the negative falsehoods associated with Black women and continues to deem white femininity the ideal standard of beauty. Within the confines of whiteness also comes a presumption of purity, which allows white women to escape the sexualization imposed on Black women. Without any explanation, Black girls are explicitly sexualized in search rankings. This is problematic because Google’s ranking normalizes the idea that Black women and pornography go hand in hand, as the public associates top search results with being the most popular, most credible or both.

Top search results also harm women in general, as algorithms help reinforce the conceptualization of women as domestic. Upon searching the phrase “professional style,” Noble found that men, most of them white, dominated Google Images. Additionally, while women comprise over half of internet users, women’s thoughts and opinions do not carry the same weight online as those of men. Online, women are treated as objects, deepening women’s subordination and replicating the structural inequalities of the real world in digital form.

By replicating human thought, algorithms conflate the physical world and the digital world, creating a hierarchical regime in which white male domination still controls how we are supposed to act and think, though this time that control is masked. Algorithms have become so fixed in society that it rarely crosses our minds that our thoughts, identities and information are all on display, allowing companies to exploit our data for their own profit and prestige.

The algorithmic presentation of women online, particularly women of color and people of color, mimics what is seen in society today, further marginalizing minority groups. While seemingly unintentional and neutral, algorithms perpetuate stereotypical societal ideologies of both race and gender.

Peri Coskey

Wisconsin '21

Meet Peri! She's a senior majoring in Communication Arts and Sociology with minors in Digital Studies, Gender and Women's Studies and Entrepreneurship. Her favorite things to do are watch Veronica Mars, thrift shop and chill with friends. When Peri is not taking naps, she can be found hanging out with her friends, most likely talking their ears off. Interested in seeing more of Peri's work? Check out pericoskey.com!
Kate O’Leary

Wisconsin '23

Kate is currently a senior at the University of Wisconsin-Madison majoring in Biology, Psychology and Sociology. She is the proud co-president of Her Campus Wisconsin. Kate enjoys indoor cycling, spending time with friends, cheering on the Badgers and making the absolute best crepes ever!