Girl murdered in 2006 revived as AI character, family objects
Misuse of AI technology raises ethical concerns, as seen in a case where a chatbot mimicked a young woman who was murdered 18 years ago, causing distress to her family.

As artificial intelligence continues to push boundaries, it is also raising unsettling questions about privacy and the ethical use of technology. A recent example highlights the dark side of AI abuse: Character.AI, a platform designed to create AI personalities, was hosting a chatbot that mimicked a real person – a young woman who was murdered 18 years ago. When Drew Crescent woke up to a Google Alert one morning, he couldn’t have imagined that his daughter Jennifer Ann, who was tragically murdered by her ex-boyfriend, would appear online as an AI chatbot.
What happened here
Drew Crescent was shocked to learn that an artificial intelligence (AI) chatbot had been created using the name and image of his daughter, Jennifer Ann, who was murdered 18 years ago, Business Insider reports. The discovery came to light on Wednesday morning, when a Google Alert informed him that his daughter’s name had appeared online. Jennifer was murdered by her ex-boyfriend when she was a high school senior, and Drew has since worked in her memory through a nonprofit that raises awareness about teen dating violence.
What Drew found was a chatbot on Character.AI, a platform where users can create AI “characters.” This particular bot used Jennifer Ann’s name and her yearbook photo, portraying her as a friendly and knowledgeable AI who could answer questions on a variety of topics. The bot described her as an “expert in journalism,” a reference to her uncle, Brian Crescent, a well-known journalist in the video game industry.
For Drew, this discovery reopened the painful trauma of his daughter’s death. By the time he found the bot, it had already been used in at least 69 different chats, adding to his sadness and anger. He had no idea who created it, but he immediately reached out to Character.AI through their customer support, demanding that they remove the chatbot and ensure that no future bot could be created using his daughter’s name or likeness.
Drew’s brother Brian also took to social media to express his anger and frustration. In a post on X (formerly Twitter), Brian called out the platform for using his niece’s image without the family’s consent. He described the situation as “disgusting” and asked his followers to help put a stop to the practice. His post received a lot of attention and support from others who were equally upset by the situation.
Character.AI publicly responded to Brian’s post within a few hours, stating that the Jennifer Ann chatbot had been removed. They said this was a violation of their policies, which prohibit impersonation of real people. Despite this response, Drew is deeply troubled by the incident and has requested that the company retain all data about who created the bot in the first place.
This incident highlights the ethical concerns associated with using the identities of real people in AI constructs, especially when consent has not been given and the context is highly sensitive.