- 57% of under-18s are concerned about being a victim of deepfake pornography
- Despite deepfake risks, 31% of those surveyed who send intimate images admit to making their face visible in such content
- 1 in 10 reported either being a victim of deepfake pornography, knowing a victim, or both
BOURNEMOUTH, UK – March 20, 2024 – Today, ESET, a global leader in cybersecurity, reveals the findings of new research into the prevalence of deepfake pornography. The study highlights concerns about the abuse of artificial intelligence tools to create sexually explicit material of real people. The survey of over 2,000 Brits found that half (50%) were worried about becoming a victim of deepfake pornography, and 1 in 10 (9%) reported either being a victim, knowing a victim, or both.
The ESET survey sheds new light on the rising problem of deepfake pornography, recently highlighted when explicit deepfakes of Taylor Swift were viewed millions of times. The democratisation of deepfake technology has fuelled a new form of image-based sexual abuse in the UK, with at least 60% of all revenge pornography victims being women (UK Council for Internet Safety).
This follows the passage of the Online Safety Act, under which creating or inciting the creation of deepfake pornography became a criminal offence. However, the ESET survey reveals that the law has done little to alleviate fears around the technology, with 61% of women reporting concern about becoming a victim, compared with less than half (45%) of men.
Deepfakes on the rise
ESET's research has found that two in five (39%) individuals believe deepfake pornography is a significant risk when sending intimate content, yet a third (34%) of adults have still sent such content. Of those who do, 58% regret* sharing it.
The percentage of people sending intimate images or videos drops to 12% among under-18s, perhaps because 57% of teenagers surveyed are concerned about becoming a victim of deepfake pornography.
Despite interest in deepfake technology soaring over the past year, people are still taking risks, with just under one-third (31%) admitting to sharing intimate images with their faces visible. Concerningly, the research also found that the average age at which someone receives their first sexual image is 14.
"These figures are deeply worrying as they show that people's online habits haven't adjusted to deepfakes yet. Digital images are nearly impossible to truly delete, and it is easier than ever to artificially generate pornography with somebody's face on it," said Jake Moore, Global Cybersecurity Advisor, ESET. "Women are disproportionately targeted more often by abusers looking to exploit intimate images, and the prevalence of deepfake technology removes the need for women to take the intimate images themselves. We're urging the government to look beyond the Online Safety Act and address this critical risk to women's safety and security."
Women are more likely to be exploited
As part of this research to assess the prevalence and impact of deepfakes, ESET also asked about previous experiences with sharing intimate images online. Shockingly, a third (33%) of all women surveyed reported that explicit images they had shared were misused. Of these, a quarter (25%) were threatened with having the images posted, and 28% had their photos posted publicly without permission.
Although progress in tackling this issue has been made with the Online Safety Act, women are still reluctant to seek help, with just 28% saying they would go to the police if someone misused their images online.
ESET's research also found:
- There is widespread misunderstanding of the law around sexting, with 44% of respondents mistakenly believing it is legal to incite or encourage someone to send sexual images as long as they themselves are under 18.
- Of those who have had their intimate images misused, just under half (46%) reported feeling embarrassment or shame.
- 19% of those who share explicit images would not contact anyone for support.
- Three in 10 women have received unwanted intimate photos or videos.
- WhatsApp (37%) was the most used platform for sharing intimate images, with Snapchat (30%) second. This is despite nine in 10 (89%) people being aware that messages can be screenshotted.
- Nearly half (48%) of all women who send explicit images first do so under the age of 16, rising to 71% under the age of 18.
Advice:
- The risk of your likeness being used in deepfake pornography is directly linked to how easy it is to find images of your face. Setting social media accounts to private and being careful about who you allow to follow you is the best way to reduce the likelihood of being targeted.
- Although online sexting may feel alien to everyday conversation, don't avoid speaking to your peers about it. The same power dynamics of manipulation, bullying and pressure apply. Talk to those around you about how to feel comfortable pushing back, and discuss boundaries and healthy ways of showing affection.
- A lot of this activity is impulsive, so building a habit of pausing and thinking before acting can help minimise rushed decisions.
- If your likeness is used to create deepfake pornography, report it to the social media platforms where it appears. Bear in mind, however, that the situation may only be fully resolved with the help of law enforcement.
- If someone sends you an intimate photo and wants something in return, remember that this could be a trap. No one has the legal right to blackmail you or threaten to publish your personal photos.
- If you do want to share intimate photos of yourself, ensure that what you share is anonymised: the content shouldn't show your face or identifying marks such as tattoos or birthmarks, and the background shouldn't reveal anything that could be used to determine your location. Any of these details can make you easy to identify and could be used for blackmail.
Methodology:
The research was conducted by Censuswide, among a sample of 2,004 respondents in total (1,002 12–17-year-olds and 1,002 18+). The data was collected between 7th and 12th December 2023. Censuswide abides by and employs members of the Market Research Society and follows the MRS code of conduct and ESOMAR principles. Censuswide is also a member of the British Polling Council.
*Regret = ‘Yes, I would never send an intimate photo or video again’ and ‘Yes, but I would send an intimate photo or video again’ combined.