Roblox, Discord, OpenAI and Google found new child safety group
Robust Open Online Safety Tools (ROOST) is a new organization founded by Roblox, Discord, Google and OpenAI to build open-source online safety tools.
The press release announcing ROOST specifically calls out plans to offer "tools to detect, review, and report child sexual abuse material (CSAM)." Rather than expecting smaller companies and organizations to build their own safety tools from scratch, ROOST wants to provide them free of charge. At least some of the companies involved in ROOST, specifically Google and OpenAI, have also already pledged to stop AI tools from being used to generate CSAM.