
A new survey commissioned by police has revealed that a quarter of people believe creating and sharing sexual deepfakes without consent is either acceptable or a neutral act, prompting warnings from law enforcement about the “epidemic” of online violence against women and girls.

The findings, published by the Office of the Police Chief Scientific Advisor, highlight a disconnect between public perception and the reality of digital abuse. While the prevalence of sexual deepfakes is estimated to have increased by 1,780 per cent between 2019 and 2024, many respondents viewed the offence as less harmful than physical crimes such as phone theft.

Of the 1,700 respondents surveyed by Crest Advisory, almost one in six said they had created a deepfake or would do so in the future. Among those who had created deepfakes, 34 per cent had generated sexual or intimate content featuring someone they knew.

“Sharing intimate images of someone without their consent, whether they are real images or not, is deeply violating,” said Detective Chief Superintendent Claire Hammond from the National Centre for VAWG and Public Protection. “The rise of AI technology is accelerating the epidemic of violence against women and girls across the world.”

Misogyny driving offences

The survey identified a correlation between misogynistic beliefs and the acceptance of deepfake abuse. Respondents who considered it morally and legally acceptable to create, share or sell non-consensual sexual deepfakes were more likely to be men under the age of 45 who actively consume pornography.

Younger demographics were generally more likely than older generations to find the creation of such content morally acceptable. In a specific scenario involving an individual creating an intimate deepfake of a partner and sharing it after an argument, 13 per cent of respondents deemed the action legally and morally acceptable, and a further nine per cent remained neutral.

“We are looking at a whole generation of kids who grew up with no safeguards, laws or rules in place about this, and now seeing the dark ripple effect of that freedom,” said Cally-Jane Beech, an activist campaigning for better protection for victims. “Stopping this starts at home. Education and open conversation need to be reinforced every day if we ever stand a chance of stamping this out.”

Barriers to reporting

Police are currently exploring technical solutions to improve reporting rates, which remain critically low. Data from the Revenge Porn Helpline indicates that only 4 per cent of people who report abuse to the helpline go on to report it to the police.

To address this, forces are trialling ‘image hashing’ technology. This process allows investigators to work with a digital fingerprint of an image rather than sharing the image itself, potentially sparing victims the distress of having the material shown in court.
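The principle behind image hashing can be illustrated with a minimal sketch. The function names and sample bytes below are illustrative, not drawn from any police system, and a plain cryptographic hash is used for simplicity; operational tools typically use perceptual hashes that also survive resizing and re-encoding.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Return a fixed-length digest that identifies an image
    without revealing anything about its content."""
    return hashlib.sha256(image_bytes).hexdigest()

# Two parties can confirm they hold the same file by comparing
# digests alone; the image itself is never transmitted or shown.
reported = image_hash(b"\x89PNG...bytes of the reported image")
seized = image_hash(b"\x89PNG...bytes of the reported image")
print(reported == seized)  # True: the digests match
```

Because the digest cannot be reversed into the original picture, a matching pair of hashes can stand in for the abusive material itself during an investigation.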

“Technology companies are complicit in this abuse and have made creating and sharing abusive material as simple as clicking a button, and they have to act now to stop it,” said Hammond. “However, taking away the technology is only part of the solution. Until we address the deeply ingrained drivers of misogyny and harmful attitudes towards women and girls across society, we will not make progress.”

