From Marcel van der Watt, NCOSE <[email protected]>
Subject Historic Victories in June!
Date July 5, 2025 11:59 AM
Dear Friend,

What an incredible month! Looking back on the victories we achieved together, I’m amazed by how we have truly made history. 

In a much-awaited ruling, the Supreme Court upheld Texas’s age verification law, which requires that pornography websites verify users are 18+ before granting them access. This landmark ruling paves the way for all U.S. states to pass similar legislation protecting children from the devastation of online pornography.

Just as amazingly, Congress struck down a reckless A.I. moratorium, which would have prevented states from passing laws to regulate A.I. for the next 10 years. Some have called this moratorium a “Section 230 on steroids,” emphasizing how much it would have undermined efforts to promote online safety.

The wide-reaching impact of both these victories cannot be overstated. Today, the world is considerably safer and the future is brighter because of you!  

Looking ahead to the next month, we are excited and honored to be hosting an online event alongside the Jensen Project, Eyes Wide Shut: The Impact of Sex Trafficking on the African American Community. Everyone is invited to attend this important webinar on July 31st. You can register here.

  

Sincerely,

Marcel van der Watt
NCOSE

VICTORY! Dangerous A.I. Moratorium STRUCK DOWN!

This week, we celebrate a massive victory for child online safety!

A dangerous A.I. moratorium, which would have prevented states from passing any laws to regulate A.I. for the next 10 years, has been struck down!

Many of you joined us in sounding the alarm about this moratorium and urging Congress to oppose it.

Congress heard your calls, and the once hotly debated proposal was shut down nearly unanimously, with a 99-1 vote.

Let this be proof that YOUR VOICE MATTERS!

Read More

Sean "Diddy" Combs Acquitted of Sex Trafficking Charges

On Wednesday, Sean "Diddy" Combs, a high-profile music mogul, was acquitted of sex trafficking charges after an eight-week trial. He was, however, convicted of two counts of transporting people for the purposes of prostitution. This conviction carries a maximum sentence of 20 years, while the sex trafficking charges could have put him behind bars for life.

Combs's former girlfriends provided witness testimony during the trial, accusing him of forcing them to engage in days-long, drug-fueled sex sessions with prostituted men while he watched and recorded the encounters. They also said he leveraged his power and influence in the music industry to silence them.

Dani Pinter, Senior Vice President and Director of the NCOSE Law Center, responded to the decision:

"We are disappointed that Sean Combs was not found guilty of sex trafficking for which

we believe the evidence against him was overwhelming.

However, we are glad he will be held accountable for illegally transporting persons for prostitution and we commend the Southern District of New York for taking this difficult case against a high profile bad actor on behalf of the victims.

Our thoughts and prayers are with the victims and witnesses and we hope they are receiving the support and protection they deserve."

Read More

Meta Pledges Action After Hundreds of Nudifying App Ads Exposed

14-year-old girls in New Jersey were violated and humiliated when their classmates used AI to create nude images of them and distributed them across the school.

A female politician’s promising career was thrown into disarray when she discovered AI-generated pornography of herself online.

A young woman’s trust was utterly destroyed when her own best friend posted AI-generated images of her on a pornography website.

These are just a few real-life examples of how people’s lives have been ruined through AI-generated image-based sexual abuse (a.k.a. “deepfake pornography”). AI-generated IBSA is a rapidly growing form of sexual violence that is impacting countless people worldwide. 

No one is immune—if a photo of you exists online, you could be victimized. 

A common way AI-generated IBSA is created is through “nudifying apps,” which allow a user to take innocuous images of women and “strip” them of clothing.

On the 2024 Dirty Dozen List, NCOSE called for Meta to ban and remove all ads for nudifying apps. Yet while these ads are banned on paper and some are removed, it remains alarmingly easy to find them in droves. Just a few weeks ago, a CBS investigation uncovered hundreds of ads for nudifying apps on Meta platforms. Shortly after, Meta issued a press release detailing their plan to crack down on nudifying app ads.

Read More

Section 230 Provided Immunity to Dangerous Kik Messaging App

Jane Doe, a minor, downloaded the Kik messaging app, naive to the hidden dangers that lay beneath its surface.

It wasn’t long before the trouble started. 

Jane began receiving messages from strange adult men on Kik. 

They sent her sexually explicit images of themselves and coerced her to do the same. 

When Jane’s father discovered what was happening, he was appalled. He reported the behavior to the police immediately. The family also sued Kik for facilitating this sexual abuse, as Kik knew its platform was being used to sexually exploit children but did not implement policies to help stop it.

But it didn’t matter. Section 230 of the Communications Decency Act allowed Kik to deflect any accountability, as usual. And Jane and her family were left with nothing after the abuse they suffered at the hands of this tech giant.

Learn More about Section 230

📣 ACTION: Urge Congress to Repeal Section 230!

Take Action!
