News presenter Cathy Newman is among the hundreds of British celebrities who have become victims of deepfake pornography.
The Channel 4 News host felt violated as she watched digitally altered footage of her face superimposed on to pornography using artificial intelligence (AI). She took part in a Channel 4 investigation, aired on Thursday evening, which analysed the five most-visited deepfake websites.
The shocking investigation found 255 of the almost 4,000 famous individuals listed were British and all but two were women. Cathy watched deepfake footage of herself for her reports. She said: “It feels like a violation. It just feels really sinister that someone out there who’s put this together, I can’t see them, and they can see this kind of imaginary version of me, this fake version of me.
“You can’t unsee that. That’s something that I’ll keep returning to. And just the idea that thousands of women have been manipulated in this way, it feels like an absolutely gross intrusion and violation. It’s really disturbing that you can, at a click of a button, find this stuff, and people can make this grotesque parody of reality with absolute ease.”
Channel 4 News said it contacted more than 40 celebrities for its investigation, and none was willing to comment publicly. It also found more than 70% of visitors to deepfake websites arrived via search engines such as Google. Deepfake images of pop star Taylor Swift were posted to X, formerly Twitter, earlier this year. The platform blocked searches linked to the singer after fans lobbied the Elon Musk-owned platform to take action.
The Online Safety Act makes it a criminal offence to share, or threaten to share, a manufactured or deepfake intimate image or video of another person without his or her consent, but it is not intended to criminalise the creation of such deepfake content. Channel 4 News claimed the most targeted individuals of deepfake pornography are women who are not in the public eye.
Tory MP Caroline Nokes, who is chairwoman of the Women And Equalities Committee, told Channel 4 News: “It’s horrific… this is women being targeted. We need to be protecting people from this sort of deepfake imagery that can destroy lives.” In a statement to the news channel, a Google spokesperson said: “We understand how distressing this content can be, and we’re committed to building on our existing protections to help people who are affected.
“Under our policies, people can have pages that feature this content and include their likeness removed from Search. And while this is a technical challenge for search engines, we’re actively developing additional safeguards on Google Search – including tools to help people protect themselves at scale, along with ranking improvements to address this content broadly.”
Ryan Daniels, from Meta, said in a statement to the broadcaster: “Meta strictly prohibits child nudity, content that sexualises children, and services offering AI-generated non-consensual nude images.” Elena Michael, a campaigner from the group NotYourPorn, told Channel 4 News: “Platforms are profiting off this kind of content. And not just porn companies, not just deepfake porn companies, social media sites as well. It pushes traffic to their site. It boosts advertising.”