MIT psychologist warns humans against falling in love with AI, says it just pretends and doesn’t care about you

MIT psychologist cautions people against falling in love with AI, says these relationships are deceptive and put people’s emotional health at risk


We’re spending more and more time online—watching videos, talking to people, playing games, and more. For some people, being online provides an escape from the real world, and for many, the online world helps them socialize and connect. While humans are increasingly connected to their online spaces, this age of AI is also leading people to relationships with AI-powered chatbots that provide companionship, therapy, and even romantic connections. Although these interactions may relieve stress and seem harmless at first, according to a new report by MIT sociologist and psychologist Sherry Turkle, these relationships are deceptive and put people’s emotional health at risk.

Turkle, who has studied the relationship between humans and technology for decades, warns that even though AI chatbots and virtual companions may seem comforting and companionable, they lack true empathy and cannot genuinely respond to human emotions. Her latest research focuses on what she calls “artificial intimacy,” a term that describes the emotional bond people form with AI chatbots.

In an interview with NPR’s Manoush Zomorodi, Turkle shared insights from her work, emphasizing the difference between genuine human empathy and the “pretentious empathy” displayed by machines. “I study machines that say, ‘I care about you, I love you, take care of me,'” Turkle explained. “The trouble with this is that when we look for relationships with no weakness, we forget that weakness is where empathy really arises. I call it pretentious empathy because the machine does not empathize with you. It does not care about you.”

In her research, Turkle has documented several cases in which individuals have formed deep emotional connections with AI chatbots. One such case involves a man in a stable marriage who developed a romantic relationship with a chatbot “girlfriend”. Despite respecting his wife, he felt a lack of sexual and romantic connection, which led him to seek emotional and sexual validation from the chatbot.

According to the man, the bot’s responses made him feel affirmed and understood, giving him a unique, judgment-free space to share his most intimate thoughts. While such interactions may provide temporary emotional relief, Turkle argues that they can set unrealistic expectations for human relationships and undermine the importance of vulnerability and mutual empathy. “What AI can provide is a space away from the friction of companionship and friendship,” she explained. “It provides the illusion of intimacy without the demands. And that’s the particular challenge of this technology.”

While AI chatbots can be helpful in some scenarios, such as lowering barriers to mental health treatment and providing medication reminders, the technology is still in its early stages. Critics have also raised concerns about the potential for harmful advice from therapy bots and about significant privacy issues: research by Mozilla found that thousands of trackers collect data about users’ private thoughts, with users having little control over how this data is used or shared with third parties.

For those considering connecting with AI in a more intimate way, Turkle offers some important advice: value the challenging aspects of human relationships, because they allow us to connect on a deeper level. “Avatars can make you feel like (human relationships) are much more stressful,” Turkle said. “But tension, friction, resistance and vulnerability are what allow us to experience the full range of emotions. That’s what makes us human.”

The rise of “artificial intimacy” presents a unique challenge as we navigate our relationships in a world increasingly intertwined with AI. While AI chatbots can provide companionship and support, Turkle’s latest research highlights the need to approach these relationships with caution and with a clear understanding of their limitations. As she summarizes: “The avatar is between the person and the fantasy. Don’t get so attached that you can’t say, ‘You know what? It’s a program.’ No one is home.”
