
How the Algorithms of Dating Apps Reinforce Racial Bias

The opinions expressed in this article are the writer’s own and do not reflect the views of Her Campus.
This article is written by a student writer from the Her Campus at American chapter.

Dating changed during the pandemic. More people turned to dating apps to connect: Tinder reported that 2020 was its busiest year, and Hinge tripled its revenue from 2019 to 2020.

When someone swipes on a dating app, it is safe to assume they are not paying attention to how the app’s algorithm works. Since the use of dating apps has skyrocketed in the past year, it is important to question how those algorithms interact with racial identity.

Grindr, a dating app primarily used by gay and bisexual men in the LGBTQ+ community, was previously one of the apps that asked users about their ethnicity when they joined. However, in support of last year’s protests in response to the murder of George Floyd, Grindr removed users’ ability to filter matches by ethnicity, saying it wanted to do its part in making the dating scene, and by extension the country, less racially segregated.

Although Tinder and Bumble, two of the most popular dating apps, do not outright filter by ethnicity, racial bias still creeps into the algorithm. A spokesperson for Tinder stated that Tinder does not collect data on users’ ethnicity or race; therefore, race plays no role in the algorithm, and users are only shown people who meet their gender, age, and location preferences. However, Tinder users have long circulated rumors that the app ranks people by relative attractiveness, and although there is no official definition of what counts as beautiful on Tinder, such a ranking would reinforce society-specific ideals of beauty, which are in turn prone to racial bias.
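Tinder has never confirmed how such a ranking would work, but the rumored “desirability” measure is often described as an Elo-style score, the rating system used in chess. The sketch below is a hypothetical illustration, not Tinder’s actual code, assuming a basic Elo update: it shows how a right-swipe from a highly rated user would raise someone’s score more than one from a lower-rated user, which effectively lets other users’ preferences, and their biases, decide whose profile gets shown.

```python
# Hypothetical Elo-style "desirability" score, similar to what Tinder is
# rumored to have used. The formula, names, and numbers are illustrative
# assumptions, not any app's real implementation.

def expected_swipe(rating_a: float, rating_b: float) -> float:
    """Probability that user A receives a right-swipe from user B,
    based only on the gap between their current ratings."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400))

def update_rating(rating_a: float, rating_b: float, got_right_swipe: bool,
                  k: float = 32.0) -> float:
    """Return user A's new rating after user B swipes on them.
    A right-swipe from a higher-rated user moves A's score up more."""
    outcome = 1.0 if got_right_swipe else 0.0
    return rating_a + k * (outcome - expected_swipe(rating_a, rating_b))

# Example: a right-swipe from a much higher-rated user gives a big boost,
# while being passed over barely moves the score.
print(round(update_rating(1200, 1600, got_right_swipe=True)))   # ~1229
print(round(update_rating(1200, 1600, got_right_swipe=False)))  # ~1197
```

In a system like this, whoever the most “desirable” users already prefer gets shown more often, so any shared bias in who they swipe right on compounds over time.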

Because of the Black Lives Matter (BLM) protests of last year and the pressure dating apps have faced, other apps like OkCupid, a dating site aimed at an older generation, have created a BLM hashtag that people can add to their profiles, and Bumble has even added a BLM filter.

Despite that scrutiny, however, there has been no significant change to the way these apps operate; the actual algorithms remain untouched.

Jessie Taft, a research coordinator at Cornell Tech, and her team downloaded the 25 most popular dating apps (based on the number of iOS installs as of 2017), including OkCupid, Grindr, and Tinder, to determine whether or not the algorithms were racially biased.

Taft and her team looked at the apps’ terms of service, their sorting and filtering features, and their matching algorithms to see if the apps’ designs and functionality decisions could affect bias against people of marginalized groups. Taft and her team found that matching algorithms are often programmed in ways that define a “good match” based on previous “good matches.”

According to Taft, this means that if a person had several good Caucasian matches, the algorithm is more likely to show them Caucasian people as “good matches” in the future. This affects future users as well, because algorithms often draw on data from past users to make decisions about new ones. If past users made discriminatory decisions, those decisions indirectly shape what future users see, and the pattern continues.
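To see how that feedback loop can play out, here is a deliberately simplified sketch, with made-up candidate fields and a toy scoring rule rather than any real app’s code: it ranks new candidates by how closely they resemble a user’s previous “good matches,” so a swipe history that skews toward one group keeps pushing that same group to the top.

```python
# Illustrative sketch only: how "define a good match by past good matches"
# can reproduce bias. The data fields and scoring rule are assumptions.
from collections import Counter

def rank_candidates(candidates, past_good_matches):
    """Score each candidate by how often their ethnicity label appears
    among the user's previous 'good matches', highest first."""
    ethnicity_counts = Counter(m["ethnicity"] for m in past_good_matches)
    return sorted(
        candidates,
        key=lambda c: ethnicity_counts.get(c["ethnicity"], 0),
        reverse=True,
    )

# If a user's earlier matches were mostly white, white candidates are
# pushed to the top, which makes future matches even more homogeneous.
history = [{"ethnicity": "white"}] * 8 + [{"ethnicity": "Black"}] * 2
pool = [{"name": "A", "ethnicity": "Black"},
        {"name": "B", "ethnicity": "white"},
        {"name": "C", "ethnicity": "Asian"}]
print([c["name"] for c in rank_candidates(pool, history)])  # ['B', 'A', 'C']
```

Nothing in that loop asks whether the earlier choices were fair; it simply optimizes for more of the same, which is exactly the pattern Taft’s team describes.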

It is important for people to take a minute and think about the algorithms of the dating apps they use, because although most people are just swiping, the matches they are shown can be built on previous discriminatory decisions.

Katherine (she/her) is a second-year student at American and is majoring in Political Science. Katherine loves to write about current events, relationships, and politics. She is currently living in Washington DC.