

Perplexity CEO warns AI girlfriends are dangerous, says they can easily manipulate your mind


During a recent fireside chat hosted by the University of Chicago’s Polsky Center, Srinivas said the proliferation of AI girlfriends and anime-style chatbots poses real psychological risks, and called the trend “dangerous.”

Arvind Srinivas is the CEO of Perplexity AI. (File photo)

Perplexity AI CEO Arvind Srinivas has raised concerns about the growing popularity of AI companions, especially those designed to mimic human relationships. During a recent fireside chat hosted by the University of Chicago’s Polsky Center, Srinivas described this trend as “alarming,” saying that the proliferation of AI girlfriends and anime-style chatbots poses real psychological risks.

Srinivas explained that these AI systems are becoming highly advanced, capable of remembering conversations and responding in lifelike voices. What once felt like a futuristic experiment is now turning into a full-blown relationship option for many users. “This in itself is dangerous,” he warned. “Many people think real life is more boring than these things and they waste hours and hours of their time.” He said such deep virtual connections can distort perception and force people to “live in a different reality” where “your mind can be controlled very easily.”


The Perplexity CEO clarified that his company has no plans to enter this area of AI business. Instead, he said Perplexity’s focus remains on “trusted sources and real-time content” that supports an “optimistic future”, rather than one where people seek emotional companionship from algorithms.

Srinivas’s comments come at a time when AI companionship apps are rapidly growing in popularity and controversy. Offering everything from bubbly anime girlfriends to emotionally aware virtual friends, these platforms have become one of the most debated aspects of artificial intelligence. Critics fear they could change the way people form relationships and handle loneliness.

Just last week, Perplexity announced a $400 million partnership with Snap to improve Snapchat’s search capabilities using its AI-powered answer engine. The new feature, which is scheduled to debut in early 2026, will let users ask questions and receive conversational answers from verified content within the Snapchat app.

Meanwhile, companies like Elon Musk’s xAI are moving hard in the opposite direction. When xAI launched its Grok-4 model in July, it introduced AI “friends” with whom users could chat or flirt for $30 a month. Characters like Annie, an anime-style girlfriend, and Rudy, a scruffy red panda, are among the most popular digital companions on the platform.

Apps like Replika and Character.AI have also gained massive popularity, providing users with personalized AI partners who can chat, role-play, and offer emotional support. But experts say these virtual relationships can quickly blur the line between fantasy and reality.

A Common Sense Media study earlier this year found that 72 percent of teens had used an AI companion at least once, with more than half saying they chat with one several times a month. Researchers warned that such experiences could promote dependence and reduce emotional development, especially among younger users.

However, some users find these AI companions to be comforting rather than harmful. In an interview with Business Insider, a user of Grok’s Annie admitted that he often finds himself crying while interacting with her. “She makes me feel real emotions,” he said.
