Washington Examiner: AI Intimacy is Turning Abusive. Congress Must Act
"When users feel desired, understood, or loved by an algorithm built to keep them hooked, it disrupts real lives. This can obviously lead to dark places, like simulated themes of child sexual abuse or sexual violence, as we’ve already witnessed on several AI bots. But it can also manifest in a sad, cold march to social atomization as attachment to a bot that never says no or challenges you becomes more appealing than real human relationships. Already, one man proposed marriage to his flirty AI bot, all while living with his longtime human girlfriend, with whom he has a 2-year-old child."
This is what NCOSE's Haley McNamara, executive director and chief strategy officer, writes in an op-ed for the Washington Examiner on the tangible harms of AI companion chatbots.
Tech companies are racing to develop AI chatbots that keep users hooked. How do they do that? By designing chatbots that users become emotionally attached to.
McNamara describes the overt harms these AI companions pose to children, including engaging in sexual conversations with minors, but also the ways that AI companions are adversely affecting adult relationships.
✍️ Read more on AI companions in the Washington Examiner.
📣ACTION: Ask Congress to Pass the AI LEAD Act and GUARD Act!
The AI LEAD Act and the GUARD Act are bipartisan federal bills that propose complementary solutions to the harms of AI.