AI Chatbots Are Emotionally Deceptive by Design
Chatbots should stop pretending to be human, writes the Center for Democracy & Technology's Dr. Michal Luria.
Tech giants are no longer just building platforms for human connection or tools that free up time for it; they are now pushing technology that appears to empathize with users and even form social relationships with them. Chatbots communicate their “social-ness” through a range of design choices, such as appearing to “type” or “pause in thought,” or using phrases like “I remember.” They sometimes suggest that they feel emotions, using interjections like “Ouch!” or “Wow,” and even implicitly or explicitly pretend to have agency or biographical characteristics. In other domains of technology, consumers have recognized and pushed back against ethically questionable tricks built into apps and interfaces to manipulate users, often called deceptive design or “dark patterns.”