
The Looming Threat of Deepfakes and Why We’re Not Talking About it Enough

This article is written by a student writer from the Her Campus at SDSU chapter.

Forget the ‘Fake News’ craze for a minute: Deepfakes are the new threat to democracy, to truth and especially to privacy… and women are the main targets. 

Deepfake videos, named after the artificial intelligence “deep learning” technology used to create them, have become increasingly popular since they first emerged on the web in late 2017. They rely on software that manipulates or fabricates video content to depict events that never actually took place, and it’s getting harder to distinguish the fake from the real.

 

Taken from Bloomberg’s YouTube video (Sept. 27, 2018).

 

Since then, Amsterdam-based cybersecurity company Deeptrace Labs has tracked the rapid rise of deepfake videos online, and the results are alarming: the number of deepfake videos available on the internet has grown 84 percent since last December, from 7,964 to nearly 15,000, CNN reports. And the numbers keep rising.

 

So what exactly does this mean? 

While there is considerable panic online about the threat deepfake technology poses to presidential elections, political campaigns and political disinformation, a new report by Deeptrace states that 96 percent of deepfakes circulating on the web are pornographic.

That’s not surprising, since deepfakes’ roots trace directly back to a Reddit account called “deepfakes,” which began posting fake porn videos in November 2017, created with software that swapped the faces of the original performers for those of well-known female celebrities.

While this kind of fabricated media can target anyone, women are overwhelmingly the primary victims of this non-consensual pornography, which is quickly spreading across internet platforms.

The report from Deeptrace also notes that the technology used to create fake sexually explicit content is becoming increasingly accessible. One example is the computer app DeepNude, which generates fake nude images from a fully clothed photo of a woman.

 

And if that isn’t scary or disturbing enough, the report’s authors told Quartz: “The software will likely continue to spread and mutate like a virus, making a popular tool for creating non-consensual deepfake pornography of women easily accessible and difficult to counter.”

This means women are once again the primary targets of a fake-media trend meant to belittle, sexualize and expose them, forcing them into a vulnerable position by invading their privacy.

 

Sound familiar? Probably because women are so often the main targets of sexual attacks, and the rapid growth of the internet keeps repeating that narrative in new forms.

The key factor in all of this deepfake panic circles back to an important conversation that women are only recently regaining control of: consent.

While deepfakes have the power to create fake news, push a problematic agenda and incite violence within countries and social groups, they can also ruin reputations, particularly women’s, because the overwhelming majority of deepfakes are pornographic.

 

Rana Ayyub, an investigative journalist and writer from India, experienced this firsthand. Writing for The Huffington Post, she detailed how she was targeted in a deepfake porn plot intended to silence her for her public disapproval of India’s protection of a child sex abuser.

 

 

Image via Rana Ayyub’s Twitter (December 30, 2018).

 

“The entire country was watching a porn video that claimed to be me and I just couldn’t bring myself to do anything,” she wrote. Local law enforcement was unwilling to help Ayyub, either. Eventually, the United Nations intervened, but it was too late.

 

“Now I don’t post anything on Facebook. I’m constantly thinking what if someone does something to me again. I’m someone who is very outspoken so to go from that to this person has been a big change. I always thought no one could harm me or intimidate me, but this incident really affected me in a way that I would never have anticipated,” she continued.

 

As Ayyub’s story shows, the effects of being the victim of a targeted deepfake attack are significant and often traumatizing. And while major news outlets have written hundreds of articles about deepfakes, college students seem not to care.

 

Why?

Largely because most news coverage of deepfakes centers on two things: wealthy politicians who have the means to defend themselves against the threat, and the technology’s potential to interfere in elections.

The average college student might think, ‘Well, how does this affect me?’ and, judging by that coverage, it doesn’t seem to. The real threat, however, lies in what is rarely discussed: how deepfakes can affect ordinary people too, including women, people of color and members of the LGBTQ community.

Oftentimes, these groups are targeted in malicious attacks precisely because they often lack protection and the financial means to fight back.

 

As the New York Daily News reports, “the people who are most vulnerable to being targeted by deepfakes are those without the means to control what counts as evidence about them.” 

And while California is trying to curb the growing threat of deepfake media by passing a law that allows residents to sue if their image is used in sexually explicit content, this law “will face a number of roadblocks,” says Jane Kirtley, a professor of media ethics and law at the Hubbard School of Journalism and Mass Communication.

Speaking to The Guardian, Kirtley says she is skeptical California will be able to truly enforce this law, mainly due to free speech protections that favor political speech. 

 

The overall theme of this article is rather simple: women are the victims of a harsh crime that is becoming more powerful, more invasive and harder to defeat.
