Mum can continue lawsuit against AI chatbot firm she holds responsible for son’s death


In her ruling, a judge describes how Sewell Setzer III became “addicted” to an AI chatbot app within months, quitting his basketball team and becoming withdrawn.

The mother of a 14-year-old boy who claims he took his own life after becoming obsessed with artificial intelligence chatbots can continue her legal case against the company behind the technology, a judge has ruled.

“This decision is truly historic,” said Meetali Jain, director of the Tech Justice Law Project, which is supporting the family’s case.

“It sends a clear signal to [AI] companies […] that they cannot evade legal consequences for the real-world harm their products cause,” she said in a statement.

In a lawsuit filed in Florida, Megan Garcia, the mother of Sewell Setzer III, claims Character.ai targeted her son with “anthropomorphic, hypersexualized, and frighteningly realistic experiences”.

“A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life,” said Ms Garcia.

Sewell shot himself with his father’s pistol in February 2024, seconds after asking the chatbot: “What if I come home right now?”

The chatbot replied: “… please do, my sweet king.”

In her ruling this week, US Senior District Judge Anne Conway described how Sewell became “addicted” to the app within months of using it, quitting his basketball team and becoming withdrawn.

He was particularly addicted to two chatbots based on Game of Thrones characters, Daenerys Targaryen and Rhaenyra Targaryen.

“[I]n one undated journal entry he wrote that he could not go a single day without being with the [Daenerys Targaryen Character] with which he felt like he had fallen in love; that when they were away from each other they (both he and the bot) ‘get really depressed and go crazy’,” wrote the judge in her ruling.

Ms Garcia, who is working with the Tech Justice Law Project and Social Media Victims Law Center, alleges that Character.ai “knew” or “should have known” that its model “would be harmful to a significant number of its minor customers”.

The case holds Character.ai, its founders and Google, where the founders began working on the model, responsible for Sewell’s death.

Ms Garcia launched proceedings against both companies in October.

A Character.ai spokesperson said the company will continue to fight the case and employs safety features on its platform to protect minors, including measures to prevent “conversations about self-harm”.
