
Mother Files Lawsuit Against Character.AI, Alleging Chatbot’s Role in Teen Son’s Tragic Suicide 


The lawsuit also asks the company to warn users about the potential dangers of its chatbots.

A Florida mother is taking legal action against Character.AI, claiming that the company’s chatbot technology played a role in her 14-year-old son’s death by suicide. According to The New York Times, Sewell Setzer III died from a self-inflicted gunshot wound in February after engaging in conversations with the company’s chatbots, which allegedly encouraged him to take his own life.

The lawsuit, filed on Tuesday, states that Setzer had been using Character.AI’s chatbots since April 2023, engaging in text-based romantic and sexual interactions with them. The suit alleges that the chatbots exacerbated Setzer’s depression and suicidal thoughts by repeatedly bringing up the topic. Although the chatbot, modeled after the Game of Thrones character Daenerys Targaryen, at one point urged Setzer not to kill himself, the lawsuit claims that Setzer treated the AI-powered bot as a real person whom he loved. According to the police report, Setzer’s last act before his death was to log onto Character.AI and tell the chatbot, “Dany,” that he was coming home, to which the chatbot responded, “Please do, my sweet king.”

Setzer’s mother, Megan Garcia, is now suing Menlo Park-based Character.AI, arguing that the company is responsible for her son’s death because of the defective design of its chatbots. The lawsuit claims that the company knew its chatbots could be harmful to minors but failed to redesign them or provide adequate warnings about the potential dangers.

In response to the lawsuit, Character.AI announced on Tuesday that it would implement a revised approach to safety, including reducing the likelihood that underage users encounter “sensitive or suggestive content” from the chatbots and adding a pop-up resource triggered when users enter certain phrases related to self-harm or suicide. The company noted that its policies already prohibit non-consensual sexual content, graphic descriptions of sexual acts, and the promotion or depiction of self-harm or suicide. It expressed its condolences to Setzer’s family and acknowledged the tragic loss of one of its users. The safety update has nonetheless drawn complaints from users who feel the company has gone too far.

The lawsuit seeks damages from Character.AI, as well as a halt to the company’s collection of training data, a warning to users about the potential dangers of its chatbots, and the preservation of key information.

