Why Mark Zuckerberg Is Ditching Human Fact-Checkers
This week, we take a look at Meta's new era of content moderation.
Michael Calore: Today we're talking about content moderation, because the big news over the past couple of weeks is that Meta is ending its third-party fact-checking program and replacing it with a Community Notes model. People in the trust and safety world have been saying, "We really disagree with this change," but interestingly, many of them were objecting less to the switch from human moderators and algorithms to Community Notes than to the fact that Meta is no longer going to proactively look for lesser violations. Then there's the practical implementation of what he's putting forward, and I think everyone can agree that content moderation on a platform with at least 2 billion daily active users around the world is a very hard problem to solve.