What are Algorithms of Oppression?

Consider the Google Search engine: how might the results we get influence our impression of the world and those around us? How much do we really know about how those results are derived? Our primary gateway to information has increasingly shifted toward results curated by for-profit web search engines run by the largest tech companies in America. In her acclaimed 2018 text Algorithms of Oppression: How Search Engines Reinforce Racism, Safiya Umoja Noble unveils the sexist, discriminatory biases perpetuated by the algorithms behind search engines created by Google and related companies.

Many researchers have critiqued the argument that the algorithms behind search engines are benign or objective; Noble reinforces their criticisms with a significant insight: these “automated decisions” made by algorithms are still designed by human beings who “hold all types of values, many of which openly promote racism, sexism, and false notions of meritocracy”. In particular, Noble focuses on degrading stereotypes of black women and girls as the primary example of these prejudices. Algorithms used by search engines are structured to reflect hegemonic narratives, resulting in the systematic oppression of women and people of color. Noble argues that this digital mode of racial- and gender-based profiling, which she calls “technological redlining”, can only be prevented by breaking up and regulating large technology monopolies like Google.

Algorithms of Oppression is divided into two major threads: a rejection of the Internet as a “cybertopia” (a utopian space free of marginalization) and an inquiry into how information scientists can develop new methods that contribute to inclusive, fair classification systems. Noble’s claim that the net is inherently biased is bolstered by countless disturbing examples.

Typing “black girls” into Google Search yields results that are horrifyingly racist and pornographic. A search for “three black teenagers” returns mugshots of African American teenagers, while “three white teenagers” are consistently represented as wholesome, the “default ‘good’”. How does this happen? Google’s PageRank algorithm did not simply appear out of nowhere: it was designed, coded, and moderated by people with biases. Because Google does not weigh a page’s relevance by its credibility or objectivity, the ranking of information by a search engine produces a sort of “social hegemony” that conserves dominant ideologies and reinforces users’ pre-existing biases. Algorithms are a product of human engineering and will always be influenced by the values of those who build them.

After establishing that the dissemination of information is far from neutral, Algorithms of Oppression urges everyone, information professionals included, to reimagine an information culture with more equitable classification systems that do not disregard certain communities or individuals. One significant factor in algorithmic oppression is the legacy of library and information science: its classification systems were the precursors to modern commercial search engines and often participate in the oppression of historically excluded people. Noble also notes the ethical concerns surrounding digitization projects that publish information about individuals without their permission, especially when that information misrepresents marginalized communities. Including fields such as gender studies and African American studies in the design process would thus allow research professionals to better understand how information classification intersects with race and gender.

The subject matter of Algorithms of Oppression is extremely relevant to the modern political and social climate, as well as to our increasing reliance on digital search for information access. Although the future of big data and AI is often painted optimistically, it is still necessary to take a closer look at the personal and wide-ranging consequences of a widespread commercial search engine. Noble’s ardent call for increased regulation and governance of commercial search engines like Google is impactful and well-reasoned, making a strong case for reclaiming information search and access under democratic, publicly accountable terms.