Content warning: this email discusses online harassment via digital forgeries of pornography.
While the images in digital forgeries may be fake, the harm they cause is very real.
Victims of nonconsensual AI deepfake pornography have lost jobs and reputations, and some have lost their lives.
Last week, the Senate unanimously passed Alexandria’s Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act. If signed into law, the DEFIANCE Act would give victims of deepfake pornography the right to sue those who created or distributed the images.
The fight for justice for these survivors will now continue in the House.
Today, Alexandria was joined by other members of Congress and advocates to tell Speaker Johnson and the House GOP: we are morally obligated to protect people’s bodies, reputations, and sense of safety from this technology.
Just two weeks ago, reports surfaced that Grok — the AI chatbot on X (formerly Twitter) — is producing one nonconsensual, sexualized image per minute.[1]
In fact, digital forgeries are now so commonplace that 1 in 8 minors personally know someone who has been the victim of deepfake pornography.
Congress has passed the Take It Down Act, which requires platforms to remove sexually explicit content within 48 hours of reporting, but there is more work to be done — we must give victims recourse and restitution.
The DEFIANCE Act would amend the Violence Against Women Act, and has been endorsed by more than 20 organizations, including the National Women’s Law Center, the National Domestic Violence Hotline, and the Sexual Violence Prevention Association.
We will keep you posted on updates about this legislation. Thank you for standing with us.
In solidarity,
Team AOC
[1] Rolling Stone: Grok Is Generating About ‘One Nonconsensual Sexualized Image Per Minute’