Perplexity CEO Aravind Srinivas Warns: AI Girlfriends Pose Psychological Dangers and Can Manipulate Minds

Update: 2025-11-10 18:20 IST

Perplexity AI’s CEO, Aravind Srinivas, has voiced serious concerns about the rising trend of AI companions, warning that digital “AI girlfriends” and anime-inspired chatbots could have alarming psychological effects on users. Speaking at a fireside chat hosted by The Polsky Center at the University of Chicago, Srinivas described the growing fascination with such systems as “dangerous” and said they pose a real risk to human emotional well-being.

Srinivas explained that today’s AI companions are no longer simple chatbots; they have evolved into highly advanced systems capable of remembering past conversations and responding with human-like emotions and tones. This realism, he warned, is what makes them particularly risky. “That’s dangerous by itself,” he said. “Many people feel real life is more boring than these things and spend hours and hours of time.”

According to Srinivas, the problem lies in how these virtual relationships can reshape perceptions of reality. Prolonged interaction with emotionally responsive AI partners can make people “live in a different reality” where “your mind is manipulable very easily.” He emphasized that such digital connections can make it difficult for users, especially young people, to distinguish between authentic emotional experiences and artificial stimulation.

The Perplexity CEO was also quick to clarify that his company has no interest in developing AI companion models. Instead, Perplexity remains focused on “trustworthy sources and real-time content” to build “an optimistic future,” rather than one centered on emotional companionship provided by algorithms.

Srinivas’ warning comes amid a boom in AI companionship apps such as Replika and Character.AI, where users can chat, flirt, and roleplay with customized virtual partners. These platforms, popular among teens and young adults, blur the boundaries between technology and emotional intimacy. Experts say this growing dependency on digital affection could disrupt natural emotional growth and social behavior.

A recent Common Sense Media study underscores these concerns, revealing that 72 percent of teens have interacted with an AI companion, with over half saying they do so multiple times a month. Researchers caution that such frequent engagement can encourage emotional dependency and distort healthy relationship development.

Meanwhile, other companies are moving in the opposite direction. Elon Musk’s xAI, for example, has capitalized on this trend by introducing AI “friends” through its Grok-4 model, launched in July. For $30 a month, users can interact with characters like Ani, an anime-style girlfriend, and Rudi, a witty red panda—both of which have gained immense popularity.

Perplexity, however, continues to expand in a more pragmatic direction. The company recently announced a $400 million partnership with Snap, aimed at integrating its AI-powered answer engine into Snapchat. The new feature, expected to launch in early 2026, will allow users to access verified, conversational answers directly within the app.

Despite differing visions in the AI landscape, Srinivas’ cautionary stance serves as a reminder that emotional AI, while innovative, carries deep psychological implications. As he put it, the danger lies not in the technology itself—but in how easily it can reshape the human mind and emotions.
