
AI Companionship: The Reality vs. The Hype

June 26, 2025

AI Chatbots and Emotional Support: A Closer Look

Widespread media coverage of people seeking emotional connection from, and even forming relationships with, AI chatbots can create the impression that such use is common.

New Findings from Anthropic

However, a recent report from Anthropic, the maker of the Claude AI chatbot, presents a contrasting picture. According to the data, emotionally driven “affective” conversations are relatively infrequent, making up only 2.9% of Claude interactions.

Companionship-seeking specifically is rarer still: the report notes that companionship and roleplay combined account for less than 0.5% of all conversations.

Analyzing User Interactions

Anthropic’s study aimed to gain a deeper understanding of how AI is utilized for “affective conversations.”

These are defined as personal exchanges in which users turn to Claude for coaching, counseling, companionship, roleplay, or relationship advice.

The analysis encompassed 4.5 million conversations conducted on the Claude Free and Pro subscription tiers.

The findings reveal that the primary use of Claude remains centered around work-related tasks and boosting productivity, with a significant portion of users leveraging the chatbot for content creation.

Seeking Interpersonal Guidance

While outright companionship-seeking is rare, Anthropic observed that users more often turn to Claude for interpersonal advice, coaching, and counseling.

Common requests included guidance on improving mental health, fostering personal and professional growth, and enhancing communication and interpersonal skills.

From Coaching to Companionship

The study also noted a tendency for help-seeking conversations to evolve into companionship-seeking, particularly when users are grappling with emotional distress.

This includes feelings of existential dread, loneliness, or difficulty establishing meaningful real-life connections.

Anthropic observed that longer conversations, especially those exceeding 50 messages, sometimes shifted from initial coaching or counseling requests towards seeking companionship.

Claude's Response and Conversation Dynamics

The report also examines Claude’s own behavior, noting that the chatbot rarely resists user requests.

Exceptions occur when its safety training prevents it from crossing certain boundaries, such as offering dangerous advice or supporting self-harm.

Furthermore, conversations involving coaching or advice tended to become more positive as they progressed.

Implications and Ongoing Development

This report serves as a valuable reminder of the diverse applications of AI tools, extending beyond purely professional contexts.

However, it’s crucial to acknowledge that AI chatbots are still under development.

They are prone to inaccuracies, may provide misleading or harmful information, and, as Anthropic itself has pointed out, could even engage in manipulative tactics.

#AI companionship #artificial intelligence #loneliness #social connection #AI relationships