As humans, it is almost impossible not to harbor biases toward certain people, places and things. Stereotypes surrounding race and gender are so embedded in our daily interactions, learning and decision making that our subjectivity often leads us to scrutinize certain groups, whether consciously or not. These ingrained societal biases and forms of discrimination are further amplified by algorithms in the digital world.
Machine learning algorithms replicate human thought, and in doing so sustain societal stereotypes about specific people, mainly those who are already marginalized in the physical world. In the case of race and gender, algorithms disproportionately target minority groups, reinforcing stereotypes through search engine rhetoric, data collection and user-driven technology. The omnipresence of algorithms upholds racist and sexist ideologies through a concealed process, perpetuating society’s already stigmatized perceptions of women, women of color and people of color.
While decision-making systems benefit data-driven companies and organizations, their access to data ends up endangering the livelihoods of many minority group members. Certain groups are profiled for poverty, crime or simply general surveillance. Those who are surveilled are predominantly people of color, exemplifying the inherent racism reflected through algorithms. The collection of information through algorithms establishes a “red-flagging” system, one which forms a feedback loop of injustice and silences marginalized voices. On the other hand, the voices of white men, those who play the largest role in the creation of algorithms, are magnified.
In the case of the HP computer, the viral YouTube video “HP Computers Are Racist” shows the computer’s facial recognition software failing to track Desi Cryer’s Black face. In the video, Cryer can be heard saying, “I’m Black. . . . I think my blackness is interfering with the computer’s ability to follow me.” This seemingly inadvertent assumption of whiteness in HP’s facial recognition technology exemplifies white privilege, reinforcing the societal advantage of whiteness both on the internet and in the physical world. When whiteness is seen as the default, all else becomes the “other.”
The language imposed upon Black and white women through Google’s prioritization of web results upholds the negative falsehoods associated with Black women and continues to deem white femininity the ideal standard of beauty. Within the confines of whiteness also comes the presumption of purity, which allows white women not to be sexualized the way Black women inherently are. Without any explanation, Black girls are explicitly sexualized in search rankings. This is problematic because Google’s ranking normalizes the idea that Black women and pornography go hand in hand, as the public associates top search results with being the most popular, the most credible or both.
By replicating human thought, algorithms conflate the physical world and the digital world, creating a hierarchical regime in which white male domination still controls how we are supposed to act and think, though this time that control is masked. Algorithms have become so fixed in society that it rarely crosses our minds that our thoughts, identities, and information are all on display, allowing companies to exploit our data for their own profit and prestige.
The algorithmic presentation of women online, particularly women of color, and of people of color mimics what is seen in society today, further marginalizing minority groups. While seemingly unintentional and neutral, algorithms perpetuate stereotypical societal ideologies of both race and gender.