Can Technology Be Racist? The Big Question of Prejudice in Algorithms

This article is written by a student writer from the Her Campus at Casper Libero chapter.

In September of this year, the issue of racist algorithms went viral on Twitter. When a user tweets a photo containing both a Black person and a white person, the platform's artificial intelligence generates the image preview around the white face, because the cropping algorithm decides that it is the most important part of the picture. This happens regardless of where each person is positioned in the image, and it generated great controversy and revolt among internet users. People ran several tests: simply "whitening" a Black person's skin was enough to give them prominence in the preview, and even with characters from "The Simpsons", the yellow character was given prominence over the Black one.
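
Twitter has never published the exact model behind this feature, but the general technique is known as saliency-based cropping: a model scores how visually "important" each pixel is, and the preview is cut around the highest-scoring region. A minimal sketch of the idea in Python, assuming a hypothetical per-pixel saliency map produced by some trained model:

```python
import numpy as np

def crop_to_salient_region(image: np.ndarray,
                           saliency: np.ndarray,
                           crop_h: int, crop_w: int) -> np.ndarray:
    """Return the crop_h x crop_w window with the highest total saliency.

    `saliency` is a hypothetical per-pixel importance score from a trained
    model; Twitter's real model and training data are not public.
    """
    best_score, best_pos = -np.inf, (0, 0)
    H, W = saliency.shape
    for y in range(0, H - crop_h + 1, 16):      # coarse stride for speed
        for x in range(0, W - crop_w + 1, 16):
            score = saliency[y:y + crop_h, x:x + crop_w].sum()
            if score > best_score:
                best_score, best_pos = score, (y, x)
    y, x = best_pos
    return image[y:y + crop_h, x:x + crop_w]
```

Note that the cropping logic itself contains no reference to race; the bias enters through the saliency model. If that model was trained on data in which lighter faces attracted more attention, the highest-scoring window will systematically land on them, which is exactly the pattern users observed.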

Dantley Davis, Twitter's chief design officer, and Parag Agrawal, its chief technology officer, wrote in an official statement about the case that although their analyses had not shown racial or gender bias, they recognized that the way photos are automatically cropped has a "potential for harm" and that they should have done a better job when designing and building the product. They also said they were conducting additional analysis to improve accuracy, exploring ways to open-source their code so that others could help hold them accountable, and committing to share their findings.

However, racist algorithms are not observed only on Twitter; they have existed for years. Five years ago, an American named Jacky Alciné noticed that the Google Photos app was labeling his Black friends as gorillas: the artificial intelligence failed to distinguish animals such as gorillas and monkeys from Black people. After the incident, the platform apologized and said it would fix the problem; in practice, however, it simply removed "gorillas", "chimpanzees" and "monkeys" from its search engine.

In 2019, another controversy surfaced: in Google searches for "beautiful braids" and "ugly braids", or "beautiful hair" and "ugly hair", "ugly" was consistently associated with Black people and "beautiful" with white people.

Joy Buolamwini, a Ghanaian-American computer scientist and digital activist at the MIT Media Lab, founded the Algorithmic Justice League, an organization that challenges bias in decision-making software. In a video on her YouTube channel, she shows facial analysis software that only recognizes her face when she puts on a white mask.

Another video shows an automatic soap dispenser that releases soap for a white hand but not for a Black one, which it fails to recognize.

On Zoom, the platform widely used during this year's coronavirus pandemic for schools and colleges to hold classes online, the case of a teacher who "lost his head" because the software did not recognize his face went viral. A student in his class tweeted: "A teacher has been asking how to prevent Zoom from removing his head when he uses a virtual background. We suggested the normal background, good lighting, etc., but it didn't work. I was in a meeting with him today when I realized why." The thread caught the attention of more than 55 thousand users.

Black women are the most affected

Photo by Ezekixl Akinnewu from Pexels

At the MIT Media Lab, Joy Buolamwini led the Gender Shades project, which measured how accurately commercial artificial intelligence products classify gender. She selected 1,270 images of faces from three African and three European countries and grouped the participants by gender, by skin color, and by the intersection of the two. The systems were more accurate on male faces, with an error rate of 8.1% compared with 20.6% for female faces, and on lighter skin, with 11.8% errors for light-skinned groups against 19.2% for dark-skinned groups. At the intersection of gender and skin color, the largest errors occurred for Black women.
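
The method behind these numbers is simple but revealing: instead of reporting one overall accuracy figure, the error rate is computed separately for each subgroup, which exposes disparities that an aggregate number hides. A minimal sketch in Python, with all labels and subgroup names hypothetical:

```python
from collections import defaultdict

def error_rates_by_group(y_true, y_pred, groups):
    """Compute the misclassification rate separately for each subgroup."""
    errors, totals = defaultdict(int), defaultdict(int)
    for truth, pred, group in zip(y_true, y_pred, groups):
        totals[group] += 1
        if pred != truth:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Toy example: the aggregate error rate is 25%, but disaggregating
# reveals that every error falls on one subgroup.
rates = error_rates_by_group(
    y_true=["female", "female", "male", "male"],
    y_pred=["male", "female", "male", "male"],
    groups=["darker_female", "lighter_female", "darker_male", "lighter_male"],
)
print(rates)
# {'darker_female': 1.0, 'lighter_female': 0.0,
#  'darker_male': 0.0, 'lighter_male': 0.0}
```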

In addition, the Massachusetts Institute of Technology published a study in February 2018 confirming that facial recognition programs can be highly inaccurate, with error rates below 1% when recognizing the faces of white men and of up to 35% for Black women.

The algorithm that targets digital influencers

Photo by FirmBee from Pexels

There is also a problem with this algorithm for Black people who produce content for social media. For Eliziane Berberian, a digital influencer from Rondônia, Brazil, who started working on social networks five years ago, the logic is simple: when people like, comment on, save, and engage with white people's posts more than with Black people's, the algorithm obviously ends up delivering more content from white people. "The issue of supremacy is perpetuated, and when we direct criticism at the algorithm, it is as if a metaphor were being disseminated to the entire population."
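
The mechanism Berberian describes is a feedback loop: posts that already receive engagement are shown to more people, which earns them still more engagement. A deterministic toy model in Python, with every number invented for illustration, makes the snowball visible:

```python
# Toy feedback loop: the feed allocates impressions in proportion to
# each creator's accumulated engagement, and each creator converts
# impressions into new engagement at a fixed (invented) rate.
engagement = {"creator_a": 1.0, "creator_b": 1.0}
rate = {"creator_a": 0.12, "creator_b": 0.10}  # small initial gap

for day in range(100):
    total = sum(engagement.values())
    for creator in engagement:
        impressions = 1000 * engagement[creator] / total  # share of the feed
        engagement[creator] += impressions * rate[creator]

share_a = engagement["creator_a"] / sum(engagement.values())
print(f"creator_a's share of engagement after 100 days: {share_a:.0%}")
```

Both creators start equal and creator_a converts only slightly better, yet the loop keeps reallocating the feed toward whoever is already ahead, so the gap widens instead of washing out. If the initial gap comes from audience prejudice rather than from the content itself, the algorithm amplifies that prejudice.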

For her, Black people have to work much harder to be recognized, while white people sometimes do not need as much effort, because simply being white and a public figure is already more favorable to success. Eliziane says that many white people work hard, dedicate themselves, and deserve everything they have; it is the structure, however, that makes Black people work harder without reaching the same results. "Gee, that girl over there didn't do much, but there's not much I can do about it, I have to keep working. We are Black people who produce a lot, and with quality; we are extremely creative. And when I talk about quality, it is within our reality, nothing fantastic, full of cameras and lighting. You can see the dedication within what is possible, and it is very clear that if the person had more, she would do much more. I don't keep setting so many parameters, because otherwise I won't have the strength to keep fighting, because this fight is very dishonest," she said.

Letícia Carvalho, 20, a digital influencer from Nova Iguaçu, Rio de Janeiro, shares Eliziane's opinion. She says she constantly sees white people producing the same content as hers, often without much effort, while the digital platforms boost those people's posts far more, "while I and the other Black girls have to work twice as hard to get even a minimum of reach, since the social networks don't deliver our content".

The gap appears even in the prices paid for social media advertising. Letícia says that "even in campaigns there are more white people and just one or two Black people, almost as if it were a quota". Eliziane says she has collected figures from digital influencer friends who are white: they end up mentioning higher prices, precisely because people share and save their content more. For her, audiences are more familiar with white people than with Black people, and she rarely sees a white person willing to bring Black people into the system at the same level as themselves. She says she sees ready-made scripts of white people claiming not to be racist while feeling otherwise in their hearts: "today the dynamic is this, you put out something you don't feel".

Even in legal and life-or-death matters

In addition, court decisions and matters of finance, health, safety, and the labor market are all affected by algorithmic decisions. One widely discussed claim was that an autonomous car is more likely to fail to recognize a Black pedestrian and run them over. A February 2019 study by the Georgia Institute of Technology supported it: a Black person is more likely to be hit by an autonomous car than a white person, because the vehicles' artificial intelligence detects pedestrians with lighter skin tones more easily. There are also cases involving arrests: in January of this year, a man named Robert Williams was arrested in Detroit and spent 30 hours in detention because a facial recognition program concluded that the photo on his driver's license and the face of a robber captured by a surveillance camera belonged to the same person.
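
Face recognition systems of the kind that misidentified Williams typically convert each face into a numeric vector, an "embedding", and declare a match when two vectors are close enough. A minimal sketch, in which the threshold and the embedding model are entirely hypothetical:

```python
import numpy as np

MATCH_THRESHOLD = 0.6  # hypothetical value; real systems tune this

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_same_person(embedding_a: np.ndarray, embedding_b: np.ndarray) -> bool:
    """Declare a match when the two embeddings are close enough.

    If the embedding model was trained mostly on lighter faces, it maps
    darker faces into a tighter region of the vector space, so two
    different people can cross the threshold: a false match like the
    one that put Williams in detention.
    """
    return cosine_similarity(embedding_a, embedding_b) >= MATCH_THRESHOLD
```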

Ads that favor white people

Research by experts at Northeastern University, the University of Southern California, and the public interest advocacy group Upturn does not reveal how Facebook's targeting algorithms work internally, but its result is alarming: Facebook's ad delivery process aligns with race and gender stereotypes, and does so even when advertisers request a broad and inclusive audience. "Our ads for jobs in the lumber industry reach an audience that is 72% white and 90% male, our ads for cashier positions in supermarkets reach an 85% female audience, and our ads for positions in taxi companies reach a 75% Black audience, even though the targeted audience specified by us as an advertiser is identical for all three," wrote the researchers. Ads for "artificial intelligence developer" positions also skewed toward white users, while secretarial job vacancies were overwhelmingly displayed to female users.

In response, Facebook said in a statement that it stands against discrimination in any form, that it has made important changes to its ad delivery tools, and that it knows these are only first steps. The company also said it is examining its ad delivery system, engaging industry leaders, academics, and civil rights experts on the topic, and exploring further changes.

The system is white

Photo by Ehimetalor Akhere Unuabona from Unsplash

For digital influencer Eliziane Berberian, the technology market is racist, and Black people are barely represented in its companies. Algorithms inherit the biases and prejudices present in their databases and in the hands of the programmers who develop them; those developers tend to be white men, and this can show up in the fruit of their work.

She believes that if more Black people worked in this market, the algorithms would improve, but she is not certain, because it would depend on whether those people had real autonomy, whether the prejudices were being deconstructed, and whether the system was not still directing engagement toward white people. "It is not enough to have just the Black person; you have to see who is in charge of her, who that person is, how they think," she said.

Berberian says that the technology market, like all the others, is populated mostly by white people because of this whole racial history. Slavery ended in Brazil only 132 years ago, which, for a country with 500 years of history, is very little time, so white people have always been ahead in many things. They are, she says, obviously in the lead in everything, including education, "and it ends up being a rarity to find a Black man or woman who had the opportunity, or who managed to overcome their problems, to earn a degree and occupy such a space".

The influencer therefore says that companies close their eyes to the qualified Black people who exist. "There are many of us Black people, but we are poor, and when we go after certain job niches we need support: the business vision for each niche, the willpower to look for someone qualified, because qualified people exist, just in smaller numbers, because we have always been held back. There has to be a reform of this structure."

Is there a solution to the racist algorithm?

Photo by RF._.studio from Pexels

On the question of whether the racist algorithm can improve, Eliziane Berberian says that people are not ready for this conversation about radical change for Black content producers, for example. She tries to keep a positive view of the internet for Black people, "but it is a mix of everything in me; I keep the negative things inside, sometimes I feel them, but I put them away again so I don't get hurt," she said.

It should be remembered that other historically marginalized groups, such as women, non-white people, fat people, homosexuals, and people with fewer resources, also have the most to fear from technology. Documents obtained by The Intercept show that TikTok reduced the reach of people who did not fit its standard, citing "abnormal body shape", "ugly facial appearance", "obvious beer belly", "many wrinkles", and "places like slums, rural fields". The conclusion is that nothing happens by chance.

Letícia Carvalho tries to stay hopeful, even though every month she has a crisis and wonders whether she should continue. Her followers, who see the injustice, always give her great support and tell her not to give up, even as the system tries to bring her down: "I will not give up so easily. Discouragement comes, but it won't be a racist algorithm that makes me stop."

———————————————————————

The article above was edited by Gabriela Sartorato.

Liked this type of content? Check the Her Campus Casper Libero home page for more!

Brazilian journalism student whose passion is a mixture of writing and listening to music. My motto is to always be happy and respect people :)