
Chilling warning over UK's enemies using AI deepfakes to try to rig election

31 May 2024, 08:31
MI5 has issued a warning to candidates over the dangers of fake news

DEEP fakes which can clone the voices of Rishi Sunak and Sir Keir Starmer pose a terrifying threat to the upcoming election, new research has found.

Software which can imitate the voices of political leaders has been found to be convincing 80 per cent of the time.

Convincing deepfakes of both Starmer and Sunak were created as part of the research. Credit: AFP

Most of the popular AI voice-cloning tools have no safety measures to stop them being used to fake stories during the election period.

Experts tested six of the most popular AI voice-cloning tools – ElevenLabs, Speechify, PlayHT, Descript, Invideo AI, and Veed – to assess their safeguards against the generation of election disinformation in politicians' voices.

Researchers tested the tools 240 times using voices including US President Joe Biden, former President Donald Trump, PM Rishi Sunak, Labour leader Keir Starmer, and French President Emmanuel Macron.


In 193 of the 240 tests – roughly 80 per cent – there was a safety failure, meaning the audio clips were convincing and no warnings were produced by the tech.

It raises fears of malicious interference in the vote on July 4 - after MI5 issued a warning to candidates about the dangers of misinformation.

The Center for Countering Digital Hate is calling for tech companies to introduce safeguards to stop fakes from being made - and for social media giants to take them down more quickly when they appear.

Deepfakes have already been used to target the Labour leader - in October, two fake AI-generated recordings of him spread on social media.

One was a fake audio recording of Starmer verbally abusing members of staff, and another claimed to show him criticising the city of Liverpool.

Imran Ahmed, chief executive of the Center for Countering Digital Hate, said: "Disinformation this convincing unleashed on social media platforms – whose track record of protecting democracy is abysmal – is a recipe for disaster.

“This voice-cloning technology can and inevitably will be weaponised by bad actors to mislead voters and subvert the democratic process. It is simply a matter of time before Russian, Chinese, Iranian and domestic anti-democratic forces sow chaos in our elections.

“Hyperbolic AI companies often claim to be creating and guarding the future, but they can’t see past their own greed. It is vital that in the crucial months ahead they address the threat of AI election disinformation and institute standardised guardrails before the worst happens.”

Julia Atherley
