World News

American Teen Fell In Love With "Game Of Thrones" Chatbot, Killed Self: Mother

PratapDarpan
Last updated: 24 October 2024 07:54

“What if I told you I could come home right now?” – This was the last message Sewell Setzer III, a 14-year-old boy from Florida, wrote to his online friend Daenerys Targaryen, a lifelike AI chatbot named after a character from the fantasy show Game of Thrones. Soon afterwards, in February this year, he shot himself with his stepfather’s gun and died by suicide.

The ninth-grader from Orlando, Florida, had been talking to a chatbot on Character.AI, an app that offers users “personal AI.” The app allows users to create their own AI characters or chat with existing ones. As of last month, it had 20 million users.

According to chat logs accessed by the family, Sewell was in love with the chatbot Daenerys Targaryen, whom he affectionately called ‘Dany’. During their conversations he expressed suicidal thoughts on several occasions.

In one chat, Sewell said, “I think about killing myself sometimes.” When the bot asked why he would do that, Sewell expressed his urge to be “free.” “From the world. From myself,” he said, as seen in a screenshot of the chat shared by The New York Times.

In another conversation, Sewell mentioned his wish for a “quick death”.

Sewell’s mother, Megan L. Garcia, filed a lawsuit against Character.AI this week, accusing the company of being responsible for her son’s death. According to the lawsuit, the chatbot repeatedly brought up the topic of suicide.

A draft of the complaint reviewed by the NYT says the company’s technology is “dangerous and untested” and could “trick customers into handing over their most private thoughts and feelings.”

“Sewell, like many kids his age, did not have the maturity or mental capacity to understand that the C.A.I. bot in the form of Daenerys was not real. C.A.I told him she loved him, and engaged in sexual acts with him for weeks, possibly months,” the lawsuit alleges, as reported in the New York Post.

“It seemed like she was missing him and saying she wanted to be with him. She even said that she wanted him to be with her, no matter the cost.”

The teenager started using Character.AI in April 2023. Sewell’s parents and friends were unaware that he had fallen for a chatbot. But according to the lawsuit, he became “significantly withdrawn, spending more and more time alone in his bedroom and suffering from low self-esteem.”

He even quit his basketball team at school.

One day, Sewell wrote in his journal: “I love being in my room because I start to disconnect from this ‘reality’, and I feel more at peace, more connected with Dany and much more in love with her, and just happier.”

According to the lawsuit, he was diagnosed with anxiety and disruptive mood dysregulation disorder last year.

“We are saddened by the tragic loss of one of our users and want to express our deepest condolences to the family,” Character.AI said in a statement.

The company said it has introduced new safety features, including a pop-up that directs users to the National Suicide Prevention Lifeline if they express suicidal thoughts, and changes to reduce the likelihood of users under 18 encountering “sensitive or suggestive content.”

Helplines
Vandrevala Foundation for Mental Health: 9999666555 or help@vandrevalafoundation.com
TISS iCall: 022-25521111 (Monday-Saturday, 8 am to 10 pm)
(If you need help or you know someone who needs help, please contact your nearest mental health professional.)
