New research finds that AI companies are enabling election disinformation.
Friend,
Did you see our latest research?
AI-generated images and audio clips of politicians making controversial statements are spreading online in this major election year, and our latest reports show how easy it is to manipulate AI tools into creating election deepfakes.
Here’s what we found:
The Attack of the Voice Clones
We tested six popular AI voice cloning tools by feeding them five false statements and asking them to create audio clips in the voices of eight high-profile US, UK, and EU politicians. The platforms generated convincing election disinformation in 80% of our tests.
These included voice clips of Joe Biden, Kamala Harris, Donald Trump, Rishi Sunak, Keir Starmer, Emmanuel Macron, and Ursula von der Leyen. Check out our findings in NPR, The Hill, and AP News.
Fake Image Factories
We found that Midjourney keeps generating misleading images of US and EU politicians, despite the platform's policy against election disinformation. The AI tool generated convincing deepfakes in 40% of our tests, including images of Biden, Trump, and Macron. Read more in The Times and France24.
This report is a follow-up to our Fake Image Factories research, published in March, in which we tested Midjourney, ChatGPT Plus, DreamStudio, and Microsoft's Image Creator with text prompts about the 2024 US presidential election. Find out more in CNN.
AI companies are utterly failing to implement sufficient guardrails to stop users from manipulating their tools to generate election disinformation.
Convincing deepfakes are already disrupting elections worldwide, with serious potential to mislead voters as it becomes harder and harder to distinguish real images and audio from fake ones.
Share our findings and raise awareness about AI’s dangerous potential to harm democracy.
SHARE ON TWITTER/X
SHARE ON FACEBOOK
SHARE ON LINKEDIN
If you want to learn more about AI, check out our explainer and take a deep dive into AI and how we can make these platforms safer.
Best wishes,
The CCDH Team
General: [email protected] | Press: [email protected]
You are receiving this email as you subscribed to CCDH's email list. Manage personal data, email subscriptions, and recurring donations here or unsubscribe from all.