
Roblox, Discord, OpenAI and Google found new child safety group


Robust Open Online Safety Tools (ROOST) is a new organization founded by Roblox, Discord, Google and OpenAI to build open-source online safety tools.

The press release announcing ROOST specifically calls out plans to offer "tools to detect, review, and report child sexual abuse material (CSAM)." Rather than expecting smaller companies and organizations to build their own safety tools from scratch, ROOST wants to provide them free of charge. At least some of the companies involved in ROOST, specifically Google and OpenAI, have already pledged to stop AI tools from being used to generate CSAM.

Related news:

Decoding OpenAI’s Super Bowl ad and Sam Altman’s grandiose blog post

OpenAI Set To Finalize First Custom Chip Design This Year

Google-backed public interest AI partnership launches with $400M+ for open ecosystem building