How Do You Feel About Apple Scanning Your iCloud Photos Library for CSAM?

By Rajesh Pandey

Published 9 Aug 2021


Last week, Apple announced several new measures to prevent the spread of Child Sexual Abuse Material (CSAM). Apple announced three changes in total, but its approach to CSAM detection is what has raised eyebrows. Essentially, your iPhone will scan photos as they are uploaded to iCloud Photos to check for CSAM.

Using on-device processing, Apple will scan images being uploaded to iCloud Photos and match their fingerprints against hashes of known CSAM images. This will allow the company to prevent the spread of CSAM and report any matches it comes across to the National Center for Missing and Exploited Children (NCMEC).

Below is how Apple explains the process:

Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.
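To make the general idea concrete, here is a minimal sketch in Swift of fingerprint-based matching: hash an image and check whether the hash appears in a set of known fingerprints. This is an illustration only, with assumed names and sample data; Apple's actual system uses a perceptual hash (NeuralHash) rather than a cryptographic hash, and wraps the result in private set intersection so a plain boolean match is never exposed like this.

```swift
import Foundation
import CryptoKit

// Hypothetical set of fingerprints of known CSAM images, as hex strings.
// In Apple's real system this is a database of NeuralHash values derived
// from NCMEC's collection, not plain cryptographic hashes.
let knownFingerprints: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

// Compute a fingerprint for an image file. SHA-256 stands in here for the
// perceptual hash Apple actually uses; a cryptographic hash only matches
// byte-identical files, whereas a perceptual hash survives resizing or
// re-encoding.
func fingerprint(of imageURL: URL) throws -> String {
    let data = try Data(contentsOf: imageURL)
    let digest = SHA256.hash(data: data)
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Check an image against the known-fingerprint set before "upload".
// Apple's real pipeline encodes the match result in an encrypted safety
// voucher via private set intersection instead of returning it directly.
func matchesKnownFingerprint(_ imageURL: URL) -> Bool {
    guard let hash = try? fingerprint(of: imageURL) else { return false }
    return knownFingerprints.contains(hash)
}
```

In Apple's described design, the device never learns the outcome of the comparison, and Apple can only read the safety vouchers after an account crosses a threshold number of matches.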

Apple says the system does not let it view your photos or determine what objects they contain. It only fingerprints photos and checks whether they match known CSAM images. As John Gruber of Daring Fireball points out, though, many US tech giants already hash-match content against NCMEC's CSAM database. Apple never took part in this, citing privacy reasons, but the company is changing its stance now.

An important point here is that this scanning applies only to photos bound for iCloud Photos, not to end-to-end encrypted services. This means Apple will not scan the photos you share in iMessage, WhatsApp, or Telegram. Only photos stored on your device are scanned, and only as they are uploaded to iCloud Photos. And if you want, you can turn off iCloud Photos to opt out of CSAM scanning entirely.

However, many iPhone users are not happy with this change. It is also a tricky situation because law enforcement agencies could pressure Apple to use this tool to fingerprint and flag other kinds of content stored on a user's iPhone.

How do you feel about Apple using on-device hash matching on the photos in your iCloud Photos library to detect CSAM? Do you think this is a breach of your privacy? Drop a comment and let us know your thoughts!