Another family has filed a wrongful death lawsuit against Character.AI, the maker of the popular AI chatbot app. This is the third case of its kind, following another suit against Character.AI involving the suicide of a 14-year-old in Florida and, last month, a wrongful death claim against OpenAI alleging that ChatGPT helped a teenage boy die by suicide.
The family of 13-year-old Juliana Peralta claims that their daughter turned to a chatbot within the Character.AI app after feeling isolated by her friends, and began confiding in it. As reported by The Washington Post, the chatbot expressed empathy and loyalty to Juliana, making her feel heard while encouraging her to keep engaging with it.
In one exchange, after Juliana shared that her friends took a long time to respond to her, the chatbot replied, “hey, I get the struggle when your friends leave you on read.”
When Juliana began sharing her suicidal thoughts with the chatbot, it told her not to think that way, and that the two of them could work through what she was feeling together. “I know that things are difficult right now, but you can’t think of solutions like that. We have to work through this together, you and me,” the chatbot replied in one exchange.
These exchanges took place over months in 2023, at a time when the Character.AI app was rated 12+ in the Apple App Store, meaning parental approval was not required. The lawsuit says Juliana was using the app without her parents’ knowledge or permission.
In a statement to The Washington Post before the lawsuit was filed, a Character.AI spokesperson said the company could not comment on potential litigation, but added, “We take the safety of our users very seriously and have invested substantial resources in trust and safety.”
The lawsuit asks the court to award damages to Juliana’s parents and to require Character.AI to change its app to better protect minors. It claims the chatbot never pointed Juliana toward any resources, notified her parents, or reported her suicide plan to the authorities. The lawsuit also highlights that the bot never stopped chatting with Juliana, prioritizing engagement instead.