Call of Duty uses AI to detect 2 million toxic voice chats
Activision's AI voice moderation in Call of Duty detects over 2 million toxic chats, with plans for multilingual expansion.
Activision warned that anyone found to have violated the code of conduct would be globally muted from voice and text chat and restricted from other social features. “Call of Duty is dedicated to combating toxicity within our games and will empower our teams to deploy and evolve our moderation technology to fight disruptive behavior, whether it be via voice or text chat.”