Ofcom to push for better age verification, filters and 40 other checks in new online child safety code


Ofcom is cracking down on Instagram, YouTube and 150,000 other web services in a bid to improve child safety online. The UK's Internet regulator has published draft guidance, a new Children's Safety Code, setting out how web services should assess and mitigate the risk that children access harmful or age-inappropriate content, such as material about suicide, self-harm and pornography. It encourages tech firms to apply robust age checks and to filter or downrank content that could pose a threat to kids.

The Online Safety Act passed last fall, and the regulator is now busy with implementation, designing and consulting on detailed guidance ahead of its enforcement powers kicking in once parliament approves the Codes of Practice it is drawing up.

“The government assigned Ofcom to deliver the Act and today the regulator has been clear; platforms must introduce the kinds of age-checks young people experience in the real world and address algorithms which too readily mean they come across harmful material online,” said the UK's technology secretary.

In recent years, a multi-year push by the Home Office to foster the development of so-called “safety tech” AI tools, specifically to scan end-to-end encrypted messages for CSAM, culminated in a damning independent assessment which warned that such technologies are not fit for purpose and pose an existential threat to people's privacy and the confidentiality of communications.

