People use AI for companionship much less than we’re led to believe


A report by Anthropic reveals that people rarely seek companionship from AI, turning to it for emotional support or advice only 2.9% of the time.

Anthropic says its study sought to unearth insights into the use of AI for “affective conversations,” which it defines as personal exchanges in which people talked to Claude for coaching, counseling, companionship, roleplay, or advice on relationships. However, the company notes that help-seeking conversations can sometimes shift toward companionship-seeking when the user is facing emotional or personal distress, such as existential dread or loneliness, or finds it hard to make meaningful connections in their real lives. Anthropic also highlighted other insights, such as how Claude itself rarely resists users’ requests, except when its programming prevents it from crossing safety boundaries, for example by providing dangerous advice or supporting self-harm.

Read the full story on TechCrunch

