Bluesky ramps up content moderation as millions join the platform
A new 100-person team steps in as concerns about child sexual abuse material rise.
In an exclusive with Platformer, Bluesky said it would quadruple its content moderation team, currently a 25-person contracted workforce, to curb a worrisome influx of child sexual abuse material (CSAM) and other content that violates the site's community guidelines, cases that have so far slipped through the existing moderation systems and warrant human oversight. "The surge in new users has brought with it concomitant growth in the number of tricky, disturbing, and outright bizarre edge cases that the trust and safety team must contend with," the company wrote. Branded as a user-powered, decentralized social network, Bluesky prioritizes an "ecosystem of third-party providers" and eschews a "centralized moderation authority" in favor of user customizability.