A girl sitting at a restaurant table looks at her phone in shock.

Olivia Hughes

Content warning: This article contains details of intimate image abuse and sexual offences.


In an era increasingly reliant on technology, revenge porn has become a weapon against women.

Over 10,000 reports were made to the Revenge Porn Helpline in 2023, a 31 per cent rise on the previous year. To tackle the growing problem, England and Wales introduced a new criminal offence to make revenge porn cases easier to prosecute. But most offenders still manage to avoid charges.

Artificial intelligence and deepfakes provide a greater challenge than ever. While AI-generated pornography of Taylor Swift has made headlines, she isn’t the only woman to be exploited through deepfakes.

Previously, revenge porn relied on existing explicit images of the victim. Now, such images can be generated without the victim's knowledge.

What Is Revenge Porn?

Revenge porn — officially known as intimate image abuse — is the sharing of private or sexual materials of another person without their consent, in the form of photos or videos. Typically, these images are shared with the purpose of causing embarrassment or distress. 

But the term ‘revenge porn’ is problematic.

A practitioner at the Revenge Porn Helpline, Georgia Street, explains that the service uses the term so victims can find the helpline easily. But the correct phrase to use is intimate image abuse.

“The use of language, such as the word ‘revenge’, can further undermine a victim’s experience.”

Street explains: “We don’t support that term [revenge porn] being used because it perpetuates the idea that victims of this offence deserve what has happened to them, by referencing ‘revenge,’ and insinuates that the image or video was consensual as that is the assumption of ‘porn’.” 

The term ‘revenge porn’ highlights the derogatory treatment of victims of intimate image abuse. Victims are often blamed for taking and sending explicit images. The use of language, such as the word ‘revenge’, can further undermine a victim’s experience, suggesting they deserved to be exploited. 

‘Revenge porn’ also fails to capture the vast range of forms this crime takes. Sexploitation (in films or TV shows) and AI-generated images also fall under the umbrella of intimate image abuse.

Intimate Image Abuse Becomes A Criminal Offence

On January 31st 2024, a new law was introduced in England and Wales, adding intimate image abuse to the Sexual Offences Act 2003. This act already covers a range of sexual offences, including sexual assault, trafficking, and incest. 

Under the new law, the prosecution no longer needs to prove an offender’s motive for distributing an intimate image. The fact that the offender distributed the images at all is enough for a conviction. Many hope this will make it easier to charge perpetrators of intimate image abuse.

“Today, everyone has images of themselves on the internet, there really is no way to protect yourself from this crime.”

Between 1 January 2019 and 31 July 2022, the Crown Prosecution Service recorded 13,860 intimate image abuse offences. Yet over that same period, the alleged offender was charged or summonsed in just 4 per cent of cases.

The law change may help overcome the “evidential difficulties” that have prevented many perpetrators from being charged.

Deepfakes: A New Type Of Revenge Porn

Under the new law, deepfakes are included as intimate image offences. This makes it a criminal offence to create or publish an AI-generated intimate image of a person without their consent.

Megan Oldham, a Research Assistant within the University of Manchester Criminology department, is particularly concerned about the rise of deepfakes.

“Deepfakes are scary because they rely on images being on the internet. Today, everyone has images of themselves on the internet, there really is no way to protect yourself from this crime,” says Oldham.

Research shows the number of deepfakes online doubled between 2018 and 2019, and the growing accessibility of AI generators has only accelerated that rise. Nearly all of the deepfake videos identified were pornographic and non-consensual.

How Can We Combat Deepfakes?

According to Oldham, there has been an arms race to create AI detection software. This would make it easy for a computer to spot and remove a fake image as soon as it appears online.

But nobody is winning this race. As soon as detection software learns to spot a deepfake, the software that creates them can be modified to evade it, so fake images can be produced faster than they are removed.

Oldham adds: “The deterrence and removal of deepfakes requires an interdisciplinary approach. Just using a counter AI software is not going to work. We need real action from policy makers to protect victims.”

Intimate Image Abuse On Social Media

“With the internet making intimate images easier to source, create and share, the number of victims is rising.”

The global, anonymous nature of the internet makes it difficult to establish laws and charge offenders. Victims remain largely unsupported when it comes to online crimes. This includes intimate image abuse, but also threats, abuse, stalking, hacking and grooming.

“We see more and more cases come through the helpline each year. Often, when there is a big case in the media, we see more victims coming forward,” explains Street.

Cases of intimate image abuse in the media are relatively novel. One of the most infamous involves Love Island star Georgia Harrison, who took reality star Stephen Bear to court for releasing an explicit film of her online. There was a 56 per cent rise in calls to the Revenge Porn Helpline in the month he was sentenced.

However, Bear served just 10 months of his 21-month prison sentence before being released in January 2024.

Of course, Harrison isn’t the first celebrity to have a ‘sex tape’ released. A pornographic video of Paris Hilton was released in 2004, without her consent, and caused a massive media stir. She is just one of many female celebrities who became a victim of exploitation in the early 2000s and is still impacted by the video today.

Hilton said of the incident: “It’s always there in the back of my mind. When it happened, people were so mean about it to me. The way that I was spoken about on nightly talk shows and the media, to see things with my family was just heartbreaking.

“I would be in tears every single day, I didn’t want to leave my house, I felt like my life was over.”

But the difference between these landmark cases demonstrates how public reactions to intimate image abuse have changed. While Hilton was mocked and parodied, people were largely sympathetic towards Harrison. This suggests we can move towards better protecting, instead of stigmatising, future victims.

Are Victims Supported?

Support for victims of intimate image abuse is playing catch-up with the fast-evolving nature of the offence.

The Revenge Porn Helpline has been operating since 2015, offering advice and help in the process of removing content from the internet. But this is the only service in the UK offering support to victims.

Looking to the future, Street and Oldham agree there is a greater need for visibility and awareness.

“There’s a lot of shame around this type of crime. As a society, we want to move away from that stigma that victims are to blame for the actions of offenders, because this is a truly devastating crime that ruins lives,” says Street. 

Even now, revenge porn still isn’t taken seriously enough.

Victim blaming remains a systemic issue, which makes it difficult for people to seek what little support exists. Despite the introduction of new laws, there are insufficient resources to combat the crime.

And, with the internet making intimate images easier to source, create and share, the number of victims is rising. As our lives move further online, we need to learn how to tackle cybercrimes.

If you have been impacted by intimate image abuse you can access support and advice from the Revenge Porn Helpline.

Featured image courtesy of Justin Snyder on Unsplash. No changes were made to this image. Image licence found here.
