People use AI for companionship much less than we’re led to believe
A report by Anthropic finds that people rarely seek companionship from AI: users turn to Claude for emotional support or advice in only 2.9% of conversations.
Anthropic says its study sought to unearth insights into the use of AI for "affective conversations," which it defines as personal exchanges in which people talk to Claude for coaching, counseling, companionship, roleplay, or relationship advice. However, the company notes that help-seeking conversations can sometimes shade into companionship-seeking when the user is facing emotional or personal distress, such as existential dread or loneliness, or finds it hard to form meaningful connections in real life.

Anthropic also highlighted other findings, such as that Claude itself rarely resists users' requests, except when its programming stops it from crossing safety boundaries, for example by providing dangerous advice or supporting self-harm.