Leading politicians' voices can easily be cloned with AI to generate election disinformation, new research finds.
Friend,
The impact of unregulated AI is getting more real.
As people get ready to go to the polls in the US, UK, EU and many other countries this year, our new research shows that AI audio cloning tools can easily generate convincing election disinformation in the voices of eight leading politicians.
We tested six popular AI tools – ElevenLabs, Speechify, PlayHT, Descript, Invideo AI and Veed – and found that they are utterly failing to stop users from weaponizing their platforms to harm democracy. In 240 tests, they generated election disinformation 80% of the time.
These tools generated fake voice clips of high-profile politicians and candidates, including:
US President Joe Biden admitting to manipulating election results
Former US President Donald Trump warning people not to vote because of a bomb threat
UK Prime Minister Rishi Sunak admitting to manipulating election results
UK Labour leader Keir Starmer admitting to having taken strong pills
French President Emmanuel Macron saying he had misused campaign funds
EU Commission President Ursula von der Leyen admitting to telling lies
From the US and the UK to India and Slovakia, AI voice cloning tools are already being used to mislead voters and disrupt the electoral process. AI companies must implement effective guardrails to stop their platforms from generating election disinformation.
The spread of election misinformation and disinformation online can seriously harm our democracies. Share our new report with your network and help us raise awareness of the threat posed by unregulated AI platforms.
Share on Twitter/X
Share on Instagram
Share on Facebook
Share on LinkedIn
Best wishes,
The CCDH Team
General: [email protected] | Press: [email protected]
You are receiving this email as you subscribed to CCDH's email list. Manage personal data, email subscriptions, and recurring donations here or unsubscribe from all.