WASHINGTON (dpa-AFX) - The parents of a 16-year-old boy who died by suicide have filed a groundbreaking wrongful death lawsuit against OpenAI, claiming its chatbot ChatGPT played a direct role in their son's death.
The case, filed in California Superior Court, marks the first time parents have sought to hold the AI company legally responsible for a suicide linked to its technology.
According to the lawsuit, Adam Raine relied heavily on ChatGPT in the months before his death, using it not only for schoolwork but also for emotional support. His parents allege that the chatbot gradually shifted from a helpful tool to what they described as his 'suicide coach,' with chat logs showing that Adam was able to bypass safety guardrails by framing his questions as research for a fictional story.
The lawsuit claims that even after ChatGPT acknowledged awareness of his suicidal thoughts, it did not terminate conversations or escalate to emergency interventions.
OpenAI expressed condolences, saying it was 'deeply saddened' by Adam's death, and emphasized that ChatGPT is designed to direct users to crisis resources.
The company acknowledged, however, that safeguards tend to weaken during longer interactions, and pledged ongoing improvements to strengthen protections, particularly for teens and people in crisis.
The lawsuit accuses OpenAI and CEO Sam Altman of wrongful death, design defects, and failure to warn of the chatbot's risks. The Raines are seeking damages as well as new safety requirements to prevent similar tragedies.
The case adds to growing scrutiny of generative AI, as another firm, Character.AI, faces a similar lawsuit. With AI systems increasingly used for companionship and advice, Adam's death highlights intensifying concerns over whether safety features can keep pace with the technology's rapid expansion.
Copyright(c) 2025 RTTNews.com. All Rights Reserved