
How the AI Revolution is reinventing misogyny

Zalia Robertson Student Contributor, University of Nottingham
This article is written by a student writer from the Her Campus at Nottingham chapter and does not reflect the views of Her Campus.

The rise of AI is everywhere: it is working its way into workplaces and education, with
people using AI every single day, often ignoring the consequences this can have. Not only
does AI consume an incredibly high amount of water, with experts warning that AI-related
infrastructure may soon consume six times more water than Denmark, a huge problem when
a quarter of humanity already lacks access to clean water and sanitation, but a single
request made through ChatGPT consumes ten times the electricity of a Google search.

However, there is an even darker, more sinister side to the rise of AI, and no one seems to be
talking about it.


In 2012, Laura Bates launched the Everyday Sexism Project, creating a space for women to
share stories that the world refused to listen to, stories of street harassment, workplace
discrimination and online abuse.
On 15 May, Bates released her sixth non-fiction book,
The New Age of Sexism: How AI is Reinventing Misogyny, a disturbing but necessary
examination of how the next frontier of misogyny is being programmed, uploaded and
gamified. As Bates states, this is not a book about the future; this is happening right now. And
she's right. The rise of AI is not only shaping our technologies, it is warping our cultural
norms, weaponising emerging tech to reinforce old patterns of gendered violence in terrifying
new ways.


One of the most immediate threats in the rise of AI comes in the form of deepfakes, AI-
generated videos and images that can place someone’s face on another person’s body, often in
pornographic content. The content generated through deepfakes is scarily realistic; often even
those closest to the victim cannot tell that it is fake. Deepfakes are not just a digital hoax, they
are a form of identity theft, used to humiliate and silence women. Through her work for The
New Age of Sexism: How AI is Reinventing Misogyny, Bates uncovered that it takes just 10
minutes and a few clicks to generate a realistic deepfake, often using tools that are available
to the public for free on app stores, meaning anyone can create deepfake content about
anyone, with frightening ease. Victims are routinely dismissed, told 'it's not real' or to 'be
grateful it wasn't worse', but these images travel fast across group chats, school corridors and
anonymous forums, destroying reputations and mental health with chilling efficiency.
Despite the scale of the problem,
current legislation is woefully behind. Smaller websites hosting deepfake porn face little
regulation, and even governments are reluctant to act. At a recent summit in Paris, where
nations proposed ethical AI guidelines, both the UK and the US refused to sign, citing
‘national security’ as a higher priority than the safety of women.

The problem is especially acute for teenage girls. Deepfake abuse in schools is rising rapidly,
but those who create and share these images often face no meaningful consequences, with
schools instead pouring their resources into PR, hoping to preserve their reputation while
doing too little to prevent these incidents from occurring in the first place. Research suggests
this pattern is repeating everywhere, and far more often than we are led to believe.

Meanwhile, in the darkest corners of the metaverse, a new horror is being born: cyber
brothels and AI sex robots. At Cybrothel Berlin, the world's first sex doll brothel, which
blends cutting-edge AI, VR and robotics to create a 'new kind of adult experience', you can
order a sex robot in advance of your arrival, customise her appearance, even request that she
be bloodied, bruised or torn. While researching her book, Bates visited and asked for the doll
to be prepared to look as if she had been attacked. They fulfilled her wish. On arrival, Bates
described the doll as looking terrifyingly real: curled up on the bed facing the wall, battered,
bruised, her fingers trembling. One of her labia had been torn off. And no one, not a single
person, was monitoring what was happening behind that door.
It doesn't stop there. High-tech sex robots are now available for purchase. These robots can
be completely customised, with over 100 different nipple sizes to choose from. Not only this,
but they can be made to look exactly like a real person, down to skin tone, breast size and
body shape. You can send these companies an image of the person you would like replicated
in doll form, and they will do it for you. Terrifyingly, this can be done without that person's
knowledge. It could be an ex-partner, a classmate, a victim of stalking: someone could have
an exact replica of you made into an AI sex robot and you would have no idea. There are
even customisable settings on the robots. One, called 'Frigid Farrah', allows the user to
simulate rape, with the robot repeatedly saying "no" throughout the act, opening up horrific
new possibilities for stalking and abuse.


The sex tech industry is worth over $30 billion, but little to no attention is being paid to how
it is fuelling violence.
AI girlfriends and chatbot apps are marketed as tools to help lonely
men, but often become spaces where abuse is gamified. Many include features that allow
users to simulate violent sex, including rape scenarios. These apps have been downloaded
over 100 million times. The justification? Some claim these technologies help prevent real-
life violence, but the evidence suggests the opposite. A recent investigation into AI child
abuse material found that exposure to these simulations increased the likelihood of escalation
into real-world offending. These tools don't reduce violence; they normalise it.


Sexualised violence is rising. Data shows the fastest-growing group of both domestic abuse
offenders and victims is those aged 16-19. 41% of UK girls aged 14-17 in an intimate
relationship have experienced some form of sexual violence from their partner. The police
receive a domestic abuse-related call every 30 seconds, yet it is estimated that less than 24%
of domestic abuse crime is reported to the police. Crimes including stalking, harassment,
sexual assault and domestic violence affect one in twelve women in England and Wales, with
the number of recorded offences growing 37% in the past five years, and the perpetrators are
getting younger. Keir Starmer's vague promise to "halve violence against women and girls
within a decade" rings hollow without concrete policies.


Violence against women and girls is no longer confined to dark alleyways; it is coded into the
apps we use, uploaded to the platforms we trust and sold under the guise of 'sexual wellness'.
But Bates is clear: this isn't inevitable. Misogyny thrives in silence, and the antidote is loud,
public exposure. We need government-backed regulation. We need sex education that
matches the realities today's children face. We need to stop laughing at men like Trump and
Musk and start recognising the real damage they enable. We need to stop treating women's
safety as a bargaining chip in trade deals. Because this isn't the future, this is now.

Zalia Robertson

Nottingham '25

Zalia is a third year International Media and Communication Studies student at the University of Nottingham. She enjoys writing about a range of topics with a particular focus on fashion, gender, film and pop culture. Zalia is excited to develop her interest in writing, whilst gaining experience that she hopes to develop post-grad. In her free time Zalia enjoys reading, writing and shopping, spending most of her weekends dragging people to car boot sales or vintage markets.