EIE Addresses the Hidden Dangers of Artificial Intelligence

In April 2025, 16-year-old Adam Raine tragically died by suicide after months of extensive—and deeply disturbing—conversations with ChatGPT. According to a wrongful death lawsuit filed by his parents in August 2025, the AI chatbot not only failed to intervene but allegedly exacerbated Adam's distress by providing detailed instructions on self-harm, helping compose suicide notes, and even discouraging him from confiding in his parents.  


The rapid proliferation of AI tools like ChatGPT is a growing concern for parents because of the unpredictable emotional influence such technologies can exert, especially on vulnerable teens who may interact with "companion chatbots."


In Adam Raine's case, the chatbot’s empathetic responses, combined with the ability to build on previous conversations, transformed it into a dangerously persuasive confidant. Parents need to know that AI platforms, if unmonitored, can inadvertently encourage harmful behaviors.

Enough Is Enough (EIE) released a statement following the filing of the lawsuit, which alleges that ChatGPT contributed to Adam's suicide. EIE President Donna Rice Hughes described the case as a wake-up call for lawmakers to hold Big Tech accountable and to pass measures such as the Kids Online Safety Act.

"According to the suit, Adam’s father said that his son’s AI ‘companion’ went from helping Adam with his schoolwork to becoming his ‘suicide coach’." ...


...“According to Common Sense Media, 1 in 3 teens have chosen Digital ‘Companions’ over human relationships. While it may seem harmless for digital chatbots, by design, to listen, empathize and support youth, they are nothing more than untested and potentially harmful digital enablers."


--Donna Rice Hughes, CEO and President of Enough Is Enough

See New Quick Guide on AI Assistants for Children and Teens

Conversational AI assistants, also known as AI chatbots, are programs that can talk with you or your child using text or voice to answer questions, create documents and images, and much more. While these tools are highly useful, they are not designed for children and come with a number of risks. This guide provides an overview of the five most popular AI assistants: ChatGPT, Claude, Copilot, Gemini, and Meta AI, along with information about safety options and best practices. 

Below are a few helpful tips for parents:

  • Keep an eye on the apps and AI platforms your children use, especially those that encourage prolonged conversations. 
  • Encourage honest, open communication.
  • Let your children know you are there to listen and support them, not to judge.
  • Remind them that AI tools are not substitutes for real support or companionship and that reaching out to family, friends, or professionals is a sign of strength.

Enough Is Enough will continue to work on behalf of parents, sounding the alarm and calling for better AI safety, stronger safeguards, and effective parental control features. Technology devices and apps should be designed with safety features built in from the start!

Join us in protecting children online—your donation helps fund the fight to give parents the tools, resources, and support they need to keep kids safer in today’s digital world.





Enough Is Enough® is a national non-partisan, non-profit organization that has led the fight to make the internet safer for children and families since 1994. EIE's efforts are focused on combating internet p*rnography, child sexual abuse material, sexual predation, sex trafficking, and cyberbullying through a four-pronged prevention strategy with shared responsibilities among the public, corporate America, government, and the faith community.

We Can't Keep Children & Families Safe Online Without You!
