Apple Could Launch Tool to Detect Child Abuse Material in iPhone Photo Library

By Rajesh Pandey

Published 5 Aug 2021


Apple will reportedly release a tool that scans the photos on your iPhone using hashing algorithms and matches them against a database of known child abuse imagery. The tool will scan photos that are set to be backed up to iCloud.

Initially, the tool will run a client-side scan of photos that are backed up to iCloud, since those photos are not end-to-end encrypted in the cloud. Data stored in iCloud is encrypted on Apple's servers but not end-to-end; photos are only fully protected while they remain on your iPhone. By performing the scan on the device, the company aims to avoid breaching its users' privacy.

The scanning will happen on the user’s device itself rather than on Apple’s servers. Any match would then be flagged for human review for further verification.
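As an illustration of the idea, here is a minimal Swift sketch of client-side hash matching. It is a sketch only: the SHA-256 digest, the loadKnownHashDatabase helper, and the flagging function are assumptions for illustration, not Apple's actual implementation, which would rely on its own image-hashing scheme.

```swift
import CryptoKit
import Foundation

// Hypothetical local database of known-material hashes; in practice such a
// list would ship with the operating system rather than be built in code.
func loadKnownHashDatabase() -> Set<String> {
    return []  // placeholder
}

let knownHashes = loadKnownHashDatabase()

// Hashes a photo on the device and checks it against the known-hash list.
// SHA-256 stands in here for whatever image-hashing scheme Apple actually uses.
func matchesKnownMaterial(_ photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Runs before a photo is uploaded to iCloud; a match is only flagged so a
// human reviewer can verify it, as described above.
func shouldFlagForReview(_ photoData: Data) -> Bool {
    return matchesKnownMaterial(photoData)
}
```

Note that a plain cryptographic hash such as SHA-256 would only catch byte-identical copies; a real system would need a perceptual image hash that tolerates resizing and re-encoding.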

Apple likely already employs similar techniques for the various machine-learning features in its Photos and Camera apps. It is possibly expanding the same technology to identify CSAM (Child Sexual Abuse Material) stored on a user’s device and ultimately report offenders. The concern is that such a tool could be modified to identify other kinds of content stored in the cloud.

Security expert Matthew Green has highlighted the cons of such a tool in a Twitter thread.

Apple could also expand the tool in the future to work with end-to-end encrypted content. Law enforcement agencies could also obtain court orders compelling Apple to search for specific content in iCloud accounts using the same hashing technique.