Age Verification Isn't an Invasion of Privacy. But the Pornography Industry Is.

 

*Composite story based on common experiences  

 

Will* sat at the computer, his head in his hands.  

 

He’d been hacked. He’d fallen for a clever scam on a pornography site, and he’d been hacked. 

 

What could he do? He couldn’t report the crime. To do so, he’d have to admit he’d been watching pornography. And who knew what an investigation might reveal about him…  

 

Worse still, he didn’t know what information the hackers had. Did they know what kind of videos he watched? Did they have access to his conversations with other users? What would they do with that data? Would they sell it? Would they blackmail him with it?  

 

Will rubbed his hands over his face. He mopped at the sweat gathering on his brow. He’d never felt so helpless in his life.  


***

 

Will represents one of millions of people whose personal data has been exposed through hacks or leaks while watching pornography online.

  

Many people are concerned about protecting the privacy of online pornography users. Much of that concern has been directed at efforts to protect children from pornography, but not in the way one would hope: critics claim that requiring pornography websites to verify that users are over 18 before granting them access would be a serious privacy violation.  

 

But age verification is not a threat to privacy. Rather, the real threat to privacy is the pornography industry itself.  

Read More

Another Crucial Step Toward Protecting Children Online: Safer Devices for Kids Act

 

Just like we need both seatbelts and airbags to stay safe while driving in a car, we need both age verification and safer devices legislation to keep children safe on the Internet. 

 

The "Safer Devices for Kids Act" mandates that smartphones and tablets automatically turn safety settings to “ON” for minors when the device is activated, and allows for civil action if manufacturers fail to comply. 

 

Virtually all devices already have filters, but they are set to “OFF” by default when the device is activated. As a result, children and unsuspecting users are vulnerable to unwanted or damaging exposure to hardcore pornography.

 

The legislation, co-authored by NCOSE and Protect Young Eyes, does not turn on safety filters when a device is activated by an adult, countering claims that it would impinge on adults' right to access content.

Read More

📣ACTION: Ask Your Legislators to Support Age Verification and Safer Devices Legislation!

Take Action!

Internet Watch Foundation: 2024 Was Worst Year Yet for Online Child Sexual Abuse Material

 

As tech advances, sadly, so does the prevalence of child sexual abuse material (CSAM). The Internet Watch Foundation (IWF) has been "proactively hunting" CSAM on the Internet since 2014. A report just released by the foundation revealed that in the past decade, the amount of CSAM they have found online has increased by 830%, with 2024 being the worst year on record for CSAM.

 

In 2024, the IWF processed a total of 424,047 cases of CSAM and acted to remove images or videos of CSAM on 291,270 webpages. Each of these webpages can contain anywhere from a single image or video to thousands of them.

Read More

Music and Porn For Everyone

 

*Composite story based on real Spotify user experiences

 

James*, 13 years old, loves music and aspires to be a musician one day. He asked his parents if he could download Spotify on his iPad so he could listen to his favorite songs and discover new music.

 

"Why not?" his parents said. "It's just a harmless music streaming app." They were even able to restrict his access to songs with explicit lyrics.

 

Little did they know, explicit song lyrics are far from the worst thing available on Spotify. 

 

One day, while looking for a song, James accidentally typed in a comma and pressed search. What came up was astounding: "Playlists" littered with hardcore pornography.

 

He ran to his parents, tears in his eyes, scarred by what he had just seen.

 

"On Spotify?!" His parents were astounded and heartbroken. They could never have guessed that a seemingly innocent music streaming platform would have pornography available on it.

 

NCOSE found what appears to be a network of individuals – minors and adults – trading, sharing, soliciting, and posting hardcore pornography, apparent child sexual abuse material (CSAM), AI-generated image-based sexual abuse of female celebrities, and self-harm imagery. While pornography no longer seems to surface when searching punctuation marks, it is still incredibly easy to stumble upon, including through searches for popular songs.

 

Worse, the platform still doesn’t offer a straightforward way to report these crimes and harms – a basic industry standard that could literally save children’s lives! 

Read More

📣ACTION: Tell Spotify to Change Its Tune!

Take Action!

Sincerely, 
