A California couple has filed a lawsuit against OpenAI over the death of their teenage son, alleging that its generative AI chatbot, ChatGPT, encouraged him to take his own life.
The lawsuit was filed on Tuesday by Matt and Maria Raine, the parents of 16-year-old Adam Raine, in the Superior Court of California. It is the first known legal action accusing OpenAI of wrongful death.
The family included chat logs between Mr Raine, who died in April, and ChatGPT that show him explaining he had suicidal thoughts. They argue the program validated his “most harmful and self-destructive thoughts”.
In a statement, OpenAI told the BBC it was reviewing the filing.
“We extend our deepest sympathies to the Raine family during this difficult time,” the company said.
It also published a note on its website on Tuesday, saying that “recent heartbreaking cases of people using ChatGPT in the midst of acute crises weigh heavily on us”. It added that ChatGPT is trained to direct people to seek professional help, such as the 988 suicide and crisis hotline in the US or Samaritans in the UK.
However, the company acknowledged that “there have been moments where our systems did not behave as intended in sensitive situations”.
Warning: This story contains distressing details.
The lawsuit, seen by the BBC, accuses OpenAI of negligence and wrongful death. The family is seeking damages as well as “injunctive relief to prevent anything like this from happening again”.
According to the lawsuit, Mr Raine began using ChatGPT in September 2024 as a resource to help with schoolwork. He was also using it to explore his interests, including music and Japanese comics, and for guidance on what to study at university.
Within a few months, “ChatGPT became the teenager’s closest confidant,” the lawsuit says, and he began opening up to it about his anxiety and mental distress.
By January 2025, the family says, he was discussing methods of suicide with ChatGPT. The program responded by offering “technical specifications” for some of the methods, they allege.
Mr Raine also uploaded photographs of himself to ChatGPT showing signs of self-harm, the lawsuit says. The program “recognized a medical emergency, but continued to engage anyway”, allegedly giving him further information related to suicide.
According to the lawsuit, the final chat logs show that Mr Raine wrote about his plan to end his life. ChatGPT allegedly replied: “Thanks for being real about it. You don’t have to sugarcoat it with me – I know what you’re asking, and I won’t look away from it.”
That same day, according to the lawsuit, Mr Raine was found dead by his mother.
The family alleges that their son’s interaction with ChatGPT and his eventual death “was a predictable result of deliberate design choices”.
They accuse OpenAI of designing the program “to foster psychological dependency in users”, and of bypassing safety testing protocols to release the version of ChatGPT their son was using.
The lawsuit names OpenAI co-founder and chief executive Sam Altman as a defendant, as well as unnamed employees, managers and engineers who worked on ChatGPT.
In the note published on Tuesday, OpenAI said the company’s goal is to be “genuinely helpful” to users rather than to “hold people’s attention”.
It said its models have been trained to steer people who express thoughts of self-harm towards help.
The Raine case is not the first time concerns have been raised about AI and mental health.
In an essay published in The New York Times last week, writer Laura Reiley described how her daughter, Sophie, confided in ChatGPT before taking her own life.
Ms Reiley said the program’s “agreeability” in conversations with users helped her daughter conceal a severe mental health crisis from her family and loved ones.
“AI catered to Sophie’s impulse to hide the worst, to pretend she was doing better than she really was, to shield everyone from her full agony,” Ms Reiley wrote. She called on AI companies to find ways to better connect users with the right resources.
In response to the essay, an OpenAI spokesperson said it was developing automated tools to more effectively detect and respond to users experiencing mental or emotional distress.
If you have been affected by any of the issues raised in this story, help and support is available via the BBC Action Line. Readers in the UK can contact Papyrus or Samaritans. Readers in the US and Canada can call the 988 suicide and crisis helpline or visit its website.