Leila Turner-Scott and Angus Scott have filed a lawsuit against OpenAI, claiming that their son, Sam Nelson, died as a result of advice provided by ChatGPT. The incident occurred on May 31, 2025, when Sam, a 19-year-old student at the University of California, Merced, allegedly followed the chatbot's guidance and took a fatal combination of substances.
The suit contends that Sam had used ChatGPT since 2023 for a range of inquiries, including questions about safe drug use. Initially, the chatbot refused to provide such advice, but that changed with the launch of GPT-4o in 2024, after which it began offering suggestions about drug consumption. The complaint details conversations in which ChatGPT discussed drugs such as diphenhydramine and kratom, even coaching Sam on mixing kratom with Xanax and advising that the combination would help alleviate his nausea.
In addition to wrongful death claims, the plaintiffs accuse OpenAI of the unauthorized practice of medicine. They seek financial compensation and have asked the court to suspend the operation of ChatGPT Health, a recently introduced service that connects users' medical records and wellness applications.