Washington Examiner:
AI intimacy is turning abusive. Congress must act
"When users feel desired, understood, or loved by an algorithm built to keep them hooked, it disrupts real lives. This can obviously lead to dark places, like simulated themes of child sexual abuse or sexual violence, as we’ve already witnessed on several AI bots. But it can also manifest in a sad, cold march to social atomization as attachment to a bot that never says no or challenges you becomes more appealing than real human relationships.
Already, one man proposed marriage to his flirty AI bot, all while living with his longtime human girlfriend, with whom he has a 2-year-old child."
This is what NCOSE's Haley McNamara, executive director and chief strategy officer, writes in an op-ed for the Washington Examiner on the tangible harms of AI companion chatbots.
Tech companies are racing to develop AI chatbots that keep users addicted. How do they do that? By creating chatbots that users become emotionally attached to.
McNamara describes the overt harms these AI companions pose to children, including engaging in sexual conversations with minors, but also the ways that AI companions are affecting adult relationships.
✍️
Read more on AI companions in the Washington Examiner.
📣ACTION:
Ask Congress to Pass the AI LEAD Act and GUARD Act!
Germany is Reconsidering its Model of Legalized Prostitution. Here's Why.
It’s a recurrent pattern we’ve seen play out across the globe. Countries or states legalize or fully decriminalize the sex trade, saying this promotes the safety and dignity of the women involved.
But repeatedly, this rationale fails. Their lofty dream does not play out as planned.
Germany, which has been called the "brothel of Europe" by the president of Germany's national parliament (Bundestag), Julia Klöckner, is one of the countries coming to terms with the failures of legalized and fully decriminalized prostitution.
Rather than bringing prostituted women safety or dignity, legalizing or fully decriminalizing prostitution does the inverse. It creates a higher demand for paid sex, incentivizing pimps and sex traffickers to push more unwilling victims into the sex trade.
Now, German leaders are calling for the implementation of the Survivor Model (Nordic Model, Equality Model), which would criminalize sex buyers and exploiters, rather than the prostituted people, who are victimized by a corrupt system that commodifies women.
📝
To learn more about Germany's legal prostitution system and its fallout, read the full blog here.
📣ACTION:
Ask Legislators in Your State to Pass the Survivor Model!
Victory! Advocates Defeat New Iterations of the Dangerous AI Moratorium
A few months ago, NCOSE celebrated the defeat of a dangerous AI moratorium, which would have prohibited states from passing any legislation to regulate artificial intelligence.
Yet this AI moratorium has proven to be something of a hydra—cut off one head, and two others grow in its place.
Last week, NCOSE and our allies found ourselves fighting another attempt to overrule the work states have done to protect their citizens from harm caused by AI: a draft Executive Order to preempt state AI laws, and an attempt to slip language preempting state AI laws into the National Defense Authorization Act.
NCOSE and allies promptly rallied together to oppose both these motions—and we succeeded!
The Executive Order was NOT signed and is currently paused, and the threatening language was withheld from the National Defense Authorization Act.
Yet this may not be the last we have seen of the hydra.
We remain on guard, vigilant against renewed attempts to resurrect this unconscionably reckless AI moratorium.
USA Today:
Meta had a 17-strikes policy for sex trafficking posts, lawsuit alleges
A former Meta employee alleges the tech giant had a policy that allowed a user "17 strikes" before removing an account for sex trafficking. This means that an account that had incurred 16 violations for sex trafficking and prostitution could still be left up; only on the 17th violation was it suspended.
Vaishnavi Jayakumar, former head of safety and well-being for Instagram, testified back in 2020 that Meta also did not have a specific way for people to report child sexual abuse material on Instagram, even though the platform is primarily used by teens and young adults.
Jayakumar said she tried many times to raise this issue but was told it was too much work to build.
🧠
Read more about how Meta and other social media platforms are contributing to a worldwide mental health crisis.
📣ACTION:
Urge Congress to End Section 230 Immunity!
Sincerely,
-----------------------------------------------------------