iOS 18.2 has a child safety feature that can blur nude content and report it to Apple


Apple’s iOS 18.2 introduces machine learning-based nude content detection that runs entirely on-device and preserves end-to-end encryption. Arriving first in Australia, this expansion of the Communication Safety feature aims to protect children from explicit content.

In iOS 18.2, Apple is adding a new feature that resurrects some of the intent behind its halted CSAM scanning plans — this time, without breaking end-to-end encryption or providing government backdoors. Rolling out first in Australia, the company’s expansion of its Communication Safety feature uses on-device machine learning to detect and blur nude content, adding warnings and requiring confirmation before users can proceed. The company likely chose the land Down Under for a specific reason: The country is set to roll out new regulations that require Big Tech to police child abuse and terror content.
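Apple exposes this kind of on-device detection to third-party developers through its SensitiveContentAnalysis framework, introduced in iOS 17. The following is a minimal Swift sketch, not Apple’s own Communication Safety implementation, of how an app might check an image locally before deciding whether to blur it; the shouldBlur helper is a hypothetical name, and calling the framework requires the com.apple.developer.sensitivecontentanalysis.client entitlement:

    import Foundation
    import SensitiveContentAnalysis

    // Hypothetical helper: returns true if the image should be blurred.
    func shouldBlur(imageAt url: URL) async -> Bool {
        let analyzer = SCSensitivityAnalyzer()

        // analysisPolicy reflects the user's settings; if neither
        // Communication Safety nor Sensitive Content Warnings is
        // enabled, the analyzer performs no analysis.
        guard analyzer.analysisPolicy != .disabled else { return false }

        do {
            // Detection runs entirely on-device; the image never
            // leaves the phone, so end-to-end encryption is preserved.
            let analysis = try await analyzer.analyzeImage(at: url)
            return analysis.isSensitive
        } catch {
            // Illustrative choice: fail closed and blur if analysis errors.
            return true
        }
    }

Because the classification happens locally, a messaging app can apply this check to decrypted content on the device itself, which is how the approach sidesteps the need for server-side scanning or encryption backdoors.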



Related news:

Apple Discontinued These Four Products This Year, But One Will Return

Apple teases MacBook Pro M4 launch date with week of announcements

Apple Releases New AirPods Pro 2 Firmware Ahead of Hearing Aid Feature Launch