Teen dies by suicide after becoming obsessed with AI chatbot

Photo credit Getty Images

A Florida teenager who formed an emotional attachment to a chatbot has died by suicide, according to The New York Times.

The boy’s family says ninth-grader Sewell Setzer III, 14, grew increasingly closed-off and introverted as he grew closer to a chatbot created to mimic Daenerys Targaryen, a character from the television series Game of Thrones and the novels on which it is based. (The chatbot was created without the consent of HBO, which holds the rights to Game of Thrones.)

Setzer abandoned other, more social hobbies like gaming with friends and Formula 1 racing in favor of spending time with the AI chatbot, nicknamed “Dany,” even though he said he was aware that Dany was simply artificial intelligence and not a real person.

The teen’s conversations with Dany ranged from discussions of real-life problems to chats that were more sexual in nature, which violates the user agreement of chatbot creator Character.AI but is still possible if safeguards are removed.

Eventually, Setzer began revealing his suicidal thoughts, telling Dany he sometimes thought “about killing myself” and that doing so would allow him to “be free.”

In Setzer’s final conversation with the chatbot, Dany said, “Please come home to me as soon as possible, my love.”

Setzer asked, “What if I told you I could come home right now?” The AI responded, “Please do, my sweet king.”

After that response, Setzer shot himself with his father’s gun.

Setzer’s family plans to sue Character.AI, telling The New York Times that the company’s product is “dangerous and untested” and able to “trick customers into handing over their most private thoughts and feelings.”

“I feel like it’s a big experiment,” Setzer’s mother Megan Garcia told the NYT. “And my kid was just collateral damage.”

A Character.AI spokesperson responded to the NYT’s request for comment by saying the company wants to “acknowledge that this is a tragic situation.”

“We take the safety of our users very seriously, and we’re constantly looking for ways to evolve our platform,” they added.
