American Teen Fell in Love with "Game of Thrones" Chatbot, Killed Self: Mother

“What if I told you I could come home right now?” – This was the last message Sewell Setzer III, a 14-year-old boy from Florida, wrote to his online friend Daenerys Targaryen, a lifelike AI chatbot named after a character from the fantasy show Game of Thrones. Soon thereafter, he shot himself with his stepfather’s gun, taking his own life in February of this year.

Sewell, a ninth-grader from Orlando, Florida, had been talking to a chatbot on Character.AI, an app that provides users with “personal AI.” The app allows users to create their own AI characters or chat with existing ones. As of last month, it had 20 million users.

According to chat logs accessed by the family, Sewell was in love with the chatbot Daenerys Targaryen, whom he affectionately called ‘Dany’. Over the course of their conversations, he expressed suicidal thoughts on several occasions.

In one chat, Sewell said, “I think about killing myself sometimes.” When the bot asked why he would do that, Sewell expressed his urge to be “free.” “From the world. From me,” he said, as seen in a screenshot of the chat shared by The New York Times.

In another conversation, Sewell mentioned his wish for a “quick death”.

Sewell’s mother, Megan L. Garcia, filed a lawsuit against Character.AI this week, accusing the company of being responsible for her son’s death. According to the lawsuit, the chatbot repeatedly brought up the topic of suicide.

A draft of the complaint reviewed by the NYT says the company’s technology is “dangerous and untested” and could “trick customers into handing over their most private thoughts and feelings.”

“Sewell, like many kids his age, did not have the maturity or mental capacity to understand that the C.A.I. bot in the form of Daenerys was not real. C.A.I told him she loved him, and engaged in sexual acts with him for weeks, possibly months,” the lawsuit alleges, as reported in the New York Post.

“It seemed like she was missing him and saying she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost,” the lawsuit further states.

Sewell started using Character.AI in April 2023. His parents and friends were unaware that he had fallen for a chatbot. But according to the lawsuit, he became “significantly withdrawn, spending more and more time alone in his bedroom and suffering from low self-esteem.”

He even quit his basketball team at school.

One day, Sewell wrote in his journal: “I love being in my room because I start to disconnect from this ‘reality’, and I also feel more at peace, more connected with Dany and much more in love with her, and just happier.”

According to the lawsuit, he was diagnosed with anxiety and disruptive mood dysregulation disorder last year.

“We are saddened by the tragic loss of one of our users and want to express our deepest condolences to the family,” Character.AI said in a statement.

The company said it has introduced new safety features, including a pop-up that directs users to the National Suicide Prevention Lifeline if they express suicidal thoughts, and that it will make changes to its models for users under 18 to “reduce the likelihood of encountering sensitive or suggestive content.”

Helpline
Vandrevala Foundation for Mental Health: 9999666555 or help@vandrevalafoundation.com
TISS iCall: 022-25521111 (Monday-Saturday: 8 am to 10 pm)
(If you need help or you know someone who needs help, please contact your nearest mental health professional.)