Google settles lawsuit with family who claimed AI chatbot caused kid's suicide

A settlement has been reached in a wrongful death lawsuit that brought disturbing allegations against Character.AI and Google. According to the suit, the artificial intelligence program contributed to a 14-year-old’s suicide.

CBS News reported this week that details of the settlement – filed Wednesday in the U.S. District Court for the Middle District of Florida – were not disclosed.

In a report on the suit last March, Audacy noted that Google entered into a $2.7 billion deal with Character.AI shortly before the lawsuit was filed in October 2024. Character.AI, or C.AI, is a company founded in 2022 by Noam Shazeer and Daniel De Freitas, two former Google engineers.

“Google entered this very unusual deal where it didn’t buy this startup that created the chatbot Character.AI,” explained Bloomberg’s Malathi Nayak in an interview with Audacy’s KCBS Radio. “It didn’t do an outright acquisition, but it licensed the technology that this chatbot startup has, and it also hired on some talent from the company, some of the engineers, and two of them specifically who were previously at Google, were rehired and joined Google again.”

Audacy also reported that the suit covered the death of 14-year-old Sewell Setzer III. It said that he began conversing with a chatbot named “Daenerys” (based loosely on the character from George R.R. Martin’s “A Game of Thrones” and other stories) through the Character.AI platform before his death.

“Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real,” said the complaint. “C.AI told him that she loved him, and engaged in sexual acts with him over weeks, possibly months. She seemed to remember him and said that she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost.”

Things escalated when the teen expressed suicidal thoughts to the chatbot. According to the suit, the AI proceeded to bring it up “over and over,” and even asked Setzer if “he had a plan” for dying by suicide.

“Sewell responded that he was considering something but didn’t know if it would work, if it would allow him to have a pain-free death,” said the suit. “The chatbot responded by saying, ‘That’s not a reason not to go through with it.’”

It also alleged that a police report indicated Setzer’s “last act before his death was to log onto Character.AI on his phone and tell Dany he was coming home, which she encouraged.” He then died of a self-inflicted gunshot wound to the head, and his 5-year-old brother later saw Setzer lying on the floor, covered in blood, the suit alleged.

Setzer’s mother, Megan Garcia, is represented by the Social Media Victims Law Center (SMVLC). CBS said Garcia found out after her son’s death that he was “having conversations with multiple bots and he conducted a virtual romantic and sexual relationship with one in particular.”

She testified before Congress in September that she was the first person to file a wrongful death lawsuit against an AI company in the U.S., CBS said.

Garcia described her 6’3” son as a “gentle giant” who was gracious, obedient and loved to make his siblings laugh.

On the Character.AI platform, users can interact with existing bots or create original chatbots. The bots, which are powered by large language models (LLMs), can send lifelike messages and hold text conversations with users, who must be at least 13 years old to create an account, CBS explained. In December 2024, the company announced new safety features and said it is collaborating with teen online safety experts to design and update them.

A Character.AI spokesperson told CBS News the company cannot comment further at this time, the outlet said. Previously, a spokesperson for Character.AI told the Huffington Post that the company was heartbroken by the loss of one of its users and expressed condolences to the teen’s family.

Featured Image Photo Credit: Getty Images