Rec Room reduces toxic chat incidents by 70% with intelligent voice moderation
Intelligent voice moderation solutions slash problematic behavior in VR experiences. Learn more in this GamesBeat Next 2024 panel recap.
It’s vital that developers dig into how their users relate to one another, to understand how to mitigate harm, improve safety and trust, and encourage the kinds of experiences that help players build community and stay for the long haul. Rec Room has seen a 70% reduction in toxic voice chat incidents over the past 18 months since rolling out ToxMod, along with experimenting with moderation policies and procedures and making product changes, Hussain said. Rather than rushing to a fix, spend more time defining the what, who, how and why behind the problem, because you’ll design better solutions when you truly understand what’s driving instances of toxicity, code of conduct violations or whatever harm is manifesting, Hussain said.