iOS 18.2 has a child safety feature that can blur nude content and report it to Apple
Apple’s iOS 18.2 introduces machine-learning-based nude content detection that runs entirely on-device and preserves end-to-end encryption. Initially arriving in Australia, this expansion of the Communication Safety feature aims to protect children from explicit content.
In iOS 18.2, Apple is adding a new feature that resurrects some of the intent behind its halted CSAM scanning plans — this time, without breaking end-to-end encryption or providing government backdoors. Rolling out first in Australia, the company’s expansion of its Communication Safety feature uses on-device machine learning to detect and blur nude content, adding warnings and requiring confirmation before users can proceed. The company likely chose the land Down Under for a specific reason: The country is set to roll out new regulations that require Big Tech to police child abuse and terror content.
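The built-in Communication Safety feature described here is handled by the system, but Apple exposes the same on-device nudity classifier to third-party developers through its SensitiveContentAnalysis framework (iOS 17 and later). As a rough illustration of the detect-blur-confirm flow, here is a minimal Swift sketch; the `applyBlurOverlay` and `showConfirmationWarning` helpers are hypothetical app-side hooks, not part of Apple’s API.

```swift
import Foundation
import SensitiveContentAnalysis

// Sketch: check an image for nudity entirely on-device using
// Apple's SensitiveContentAnalysis framework. No image data
// leaves the phone during analysis.
func blurIfSensitive(imageURL: URL) async {
    let analyzer = SCSensitivityAnalyzer()

    // The policy reflects the user's settings (e.g. Communication
    // Safety or Sensitive Content Warning). `.disabled` means the
    // user has opted out, so analysis should be skipped.
    guard analyzer.analysisPolicy != .disabled else { return }

    do {
        let analysis = try await analyzer.analyzeImage(at: imageURL)
        if analysis.isSensitive {
            // Hypothetical helpers: blur the image and require the
            // user to confirm before it is shown.
            applyBlurOverlay(to: imageURL)
            showConfirmationWarning()
        }
    } catch {
        print("Sensitivity analysis failed: \(error)")
    }
}

// Placeholder hooks; a real app would implement these in its UI layer.
func applyBlurOverlay(to url: URL) { /* ... */ }
func showConfirmationWarning() { /* ... */ }
```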
Or read this on Engadget