
Pawns in the Deepfake Game

"AI-assisted fake porn is here and we're all f*cked," says Samantha Cole. For the most part, she's right. Deepfake or AI-assisted porn superimposes known or 'real' faces onto other people's bodies in sexually explicit content. Its growing popularity signals the advent of the newest, shiniest weapon in the war of disinformation. I would argue against Cole on one point, however: we aren't all doomed. In researching this perverse form of 'entertainment', I have found the gendered nature of Deepfake porn increasingly obvious and, as with more traditional porn industries, it is clear that women are being disproportionately exploited. Female celebrities have long been the victims of attempts to invade their privacy through attacks such as nude photo leaks; however, technological developments in the realm of Deepfakes have added extremely realistic and degrading videos to the mix.

In 2017 a Reddit user known as 'Deepfakes' began devoting their platform to the creation of Deepfake porn videos of famous female actors. Maisie Williams, Taylor Swift, Aubrey Plaza, Gal Gadot and Scarlett Johansson have all been made victims of Deepfake porn. These videos are obviously extremely distressing for those they depict; in Gadot's case, however, her non-consensual porno was made even more disturbing by its depiction of her participating in incestuous sex with her step-brother. When you look at the list of those affected by Deepfake porn, the sexism is painfully clear. Nina Schick, an expert in the field of disinformation, has stated that among the hundreds of celebrity videos she found there was "no Brad Pitt, George Clooney, or Johnny Depp"; they were all of women. Despite the trauma that comes with this kind of identity manipulation, celebrity actresses do have one saving grace. As outlined by Johansson, due to their standing in public life, people rarely believe it is actually them starring in a porno.

Unfortunately, Deepfake porn is not targeted only at female actors, and women in other careers do not have this privilege. Rana Ayyub's experience is proof of this. Ayyub is a journalist who was due to appear on the news to discuss the controversy surrounding the rape of an eight-year-old Kashmiri girl by a Hindu man when a Deepfake porno of her was made public, seemingly in an attempt to undermine her politically. The video of Ayyub was circulated on WhatsApp which, due to its end-to-end encryption, is particularly difficult to trace. It was then shared on to a Bharatiya Janata Party fan page, where it went viral. Ayyub was unable to get sufficient help from the police, and it was not until the UN intervened that proper attention was paid to her case. Legal help aside, the emotional effect of this event is clear: Ayyub has stated, "I used to be very opinionated, now I'm much more cautious about what I post online."

Googling your own name is fairly high up on the list of aimless, narcissistic pursuits we've all undertaken at one stage or another. Sometimes we do it to ensure that future employers wouldn't find anything untoward if they were to do the same, but mostly it's just blind curiosity. The results are never that exciting either: usually a myriad of old social media photos, perhaps a few embarrassing pictures from school events, nothing noteworthy. For Noelle Martin, this was not the case. She describes how, upon searching an image of herself from social media, her screen "was flooded with that image and dozens more images of [her] that had been stolen from [her] social media, on links connected to porn sites". At the time, Martin found that in Australia, where she is from, there was no legislation in place to prevent or prosecute the circulation of non-consensual synthetic images or videos. Because of this, Martin approached those posting videos of her herself, requesting that they take them down. All of them refused, with one webmaster even demanding nude photos in exchange for the removal of the video. After a great deal of activism, Martin was successful in having new legislation enacted in 2019; however, it only applies in Australia.

The process of having these videos removed is extremely complicated. Legal protocols differ massively around the world, so while a video can be censored from view in one place, people in other countries can still gain access to it. Johansson stated, "I think it's a useless pursuit, legally, mostly because the internet is a vast wormhole of darkness that eats itself." This outlook, whilst depressing, is easily understood in the face of such an uncontrollable spread of disinformation. Furthermore, judging how to raise awareness about Deepfake porn is extremely difficult. I have found, in researching this article, that dodging the actual videos can be extremely challenging. Ayyub stated, "I didn't speak about it for a long time because I worried the larger audience would not empathise or sympathise with me but they would want to explore it more. I didn't want Deepfake to get that kind of popularity."

In addition to this, the programmes needed to create these videos are publicly accessible; they are being made by Reddit users, not by special effects teams. While the sophistication of the videos is increasing at a terrifyingly rapid rate, the technology to detect Deepfakes is comparatively still in the Dark Ages. With the increasing pervasiveness of "revenge porn", the real danger that Deepfake porn, and its accessibility, poses will no doubt become harshly apparent. The terrifying fact that Martin's case makes clear is that we, the general public, are equally at risk of having Deepfake pornos made of ourselves. Schick has argued that "it is no exaggeration to say if you have ever been recorded at any time in any form of audio-visual documentation, be that a photograph, a video or an audio recording, then you could theoretically be the victim of Deepfake fraud." Victims of Deepfake porn videos argue that the knowledge that these videos are fake does little to comfort them; the feelings of violation that come with this kind of assault on privacy are so pervasive that, for all intents and purposes, the videos are real.

This Deepfake pandemic is highlighting that, once again, women are being used as pawns. Schick's dystopian warning that "Deepfakes are coming, and we are not ready" clearly needs to be heeded.
