Apple Announces New Child Safety Measures Coming to iOS: Photo Scanning, iMessage Blur, More

BY Sanuj Bhatia

Published 5 Aug 2021


Apple today previewed new child-safety measures coming to iOS and iPadOS. The new features include Child Sexual Abuse Material (CSAM) detection for iCloud Photos, communication safety measures in Messages, and expanded guidance in Siri and Search.

Apple says the features will be available only in the United States at launch, though they are expected to expand to more regions over time. "Protecting children is an important responsibility," the company says.

Messages Safety Features

Apple says that when a child in an iCloud Family receives or sends a sexually explicit photo, the image will be blurred and a warning saying the "image is sensitive" will appear. If the child opts to "View Photo," another pop-up explaining why the photo is considered sensitive will come up.

If the child chooses to view the photo anyway, a parent in the iCloud Family will receive a notification "to make sure they're OK." The prompt will also include a link to additional help.

Apple says the feature uses on-device machine learning to analyze image attachments and determine whether a photo is sexually explicit. Apple reiterates that iMessage remains end-to-end encrypted and that it does not gain access to any messages or photos.
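To make that flow concrete, here is a minimal sketch of how such an on-device gate could work. Everything here is hypothetical — the classifier protocol, the score threshold, and the presentation type are illustrative stand-ins — since Apple has said only that the analysis happens entirely on the device, not how it is implemented.

```swift
import Foundation

// Hypothetical on-device classifier; Apple has not published the real API.
// It stands in for the machine-learning model that Messages would run locally.
protocol SensitiveImageClassifier {
    // Returns the probability (0...1) that the image data is sexually explicit.
    func explicitScore(for imageData: Data) -> Double
}

enum AttachmentPresentation {
    case showNormally
    case blurWithWarning(message: String)
}

// Decides, entirely on-device, how an incoming image attachment should be shown
// to a child account. No image data leaves the device in this flow.
func presentation(for imageData: Data,
                  classifier: SensitiveImageClassifier,
                  threshold: Double = 0.9) -> AttachmentPresentation {
    let score = classifier.explicitScore(for: imageData)
    if score >= threshold {
        return .blurWithWarning(message: "This image is sensitive. View photo?")
    }
    return .showNormally
}
```

Because the decision is made locally, the blurred preview and the warning can be shown without the photo or the conversation ever being sent to Apple.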

Scanning Photos for Child Sexual Abuse Material (CSAM)


This is the feature that leaked earlier today. Apple says iCloud will now be able to detect when CSAM photos are stored in the cloud. If any instances of such photos are found, the company will be able to report them to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies.

Essentially, the feature matches photos being uploaded to iCloud against a database of known CSAM images provided by NCMEC. If a violation is found, Apple may report it. The company says all of the matching is done on-device.

The matching is done on-device before an image is uploaded to iCloud, against a database of "unreadable CSAM hashes." If there's a match, the device creates what Apple calls a "cryptographic safety voucher." Only when an account crosses a threshold of matches is a review triggered.
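The sketch below illustrates the flow Apple describes, with hypothetical names and types throughout (CSAMHashDatabase, SafetyVoucher, UploadPipeline, and the perceptual-hash input are all illustrative; Apple has not published the actual interfaces):

```swift
import Foundation

// Illustrative stand-ins for the pieces Apple describes; these are not
// Apple's actual types or implementation.
struct CSAMHashDatabase {
    // Blinded, "unreadable" hashes of known CSAM shipped to the device.
    let knownHashes: Set<Data>
    func contains(_ hash: Data) -> Bool { knownHashes.contains(hash) }
}

struct SafetyVoucher {
    let imageIdentifier: String
    // Encrypted match metadata, opaque until the account crosses the threshold.
    let encryptedPayload: Data
}

final class UploadPipeline {
    private let database: CSAMHashDatabase
    private let reviewThreshold: Int
    private var vouchers: [SafetyVoucher] = []

    init(database: CSAMHashDatabase, reviewThreshold: Int) {
        self.database = database
        self.reviewThreshold = reviewThreshold
    }

    // Called before each image is uploaded to iCloud Photos.
    // `perceptualHash` stands in for the on-device hashing step.
    func prepareForUpload(imageIdentifier: String, perceptualHash: Data) {
        if database.contains(perceptualHash) {
            // A match produces a safety voucher that accompanies the upload.
            vouchers.append(SafetyVoucher(imageIdentifier: imageIdentifier,
                                          encryptedPayload: Data()))
        }
        if vouchers.count >= reviewThreshold {
            triggerReview()
        }
    }

    private func triggerReview() {
        // Only past the threshold is the account flagged for review and a
        // report to NCMEC considered.
        print("Threshold crossed: account flagged for review")
    }
}
```

The key design point Apple emphasizes is that individual matches mean nothing on their own; only an account that accumulates a collection of matches past the threshold is ever surfaced for review.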

Apple says its threshold system is robust and offers significant privacy benefits over existing techniques:

• This system is an effective way to identify known CSAM stored in iCloud Photos accounts while protecting user privacy.
• As part of the process, users also can’t learn anything about the set of known CSAM images that is used for matching. This protects the contents of the database from malicious use.
• The system is highly accurate, with an extremely low error rate of less than one in one trillion accounts per year.
• The system is significantly more privacy-preserving than cloud-based scanning, as it only reports users who have a collection of known CSAM stored in iCloud Photos.

Siri and Search

Apple is also expanding safety features to Siri and Search in iOS 15. Siri can now help users report CSAM or child exploitation, pointing them to the relevant resources and explaining how to file a report. Search will also intervene when users search for CSAM-related topics. "These interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue," says Apple.

Post developing…

What are your thoughts on Apple’s new child-safety measures? Let us know in the comments section below!