Apple Answers Some Important Privacy-Related Questions in New CSAM FAQ

BY Sanuj Bhatia

Published 9 Aug 2021


Last week, Apple announced a controversial set of child safety features for iOS, iPadOS, and macOS. Despite their well-intentioned goal, the CSAM features have received a lot of backlash and raised questions about privacy. Today, the Cupertino-based giant published a detailed FAQ answering some of the most important questions about CSAM detection and user privacy.

Apple posted the six-page FAQ after the features drew heavy criticism. Privacy advocates, including Edward Snowden, have slammed the move, saying it will turn iPhones into “iNarcs.”

Apple CSAM FAQ

The FAQ begins with Apple reassuring users that it will not let any government expand the feature for other uses.

“Let us be clear, this technology is limited to detecting CSAM [child sexual abuse material] stored in iCloud and we will not accede to any government’s request to expand it.”

Apple then highlights that the two features, communication safety in Messages and CSAM detection in iCloud Photos, are entirely separate.

What are the differences between communication safety in Messages and CSAM detection in iCloud Photos?

These two features are not the same and do not use the same technology.

Communication safety in Messages is designed to give parents and children additional tools to help protect their children from sending and receiving sexually explicit images in the Messages app. It works only on images sent or received in the Messages app for child accounts set up in Family Sharing. It analyzes the images on-device, and so does not change the privacy assurances of Messages. When a child account sends or receives sexually explicit images, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view or send the photo. As an additional precaution, young children can also be told that, to make sure they are safe, their parents will get a message if they do view it.

The second feature, CSAM detection in iCloud Photos, is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images. CSAM images are illegal to possess in most countries, including the United States. This feature only impacts users who have chosen to use iCloud Photos to store their photos. It does not impact users who have not chosen to use iCloud Photos. There is no impact to any other on-device data. This feature does not apply to Messages.
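To make the distinction more concrete, here is a rough sketch in Swift of the on-device flow Apple describes for Messages. To be clear, this is not Apple’s code: the types, names, and the parental-notification flag are invented for illustration, and the real analysis happens entirely on the device.

```swift
import Foundation

// Hypothetical sketch only: these types and names are invented for
// illustration and are not Apple APIs. It mirrors the flow Apple describes
// for child accounts set up in Family Sharing: analyze on-device, blur,
// warn the child, and optionally notify parents of younger children.

struct ChildAccount {
    let isInFamilySharing: Bool
    let parentalNotificationEnabled: Bool  // assumed opt-in for younger children
}

enum MessagePhotoAction {
    case showNormally
    case blurAndWarn(notifyParentsIfViewed: Bool)
}

/// Stand-in for the on-device analysis; nothing ever leaves the device.
func isSexuallyExplicit(_ imageData: Data) -> Bool {
    // Placeholder: the actual check Apple describes runs on-device.
    false
}

func handleIncomingPhoto(_ imageData: Data, for account: ChildAccount) -> MessagePhotoAction {
    guard account.isInFamilySharing, isSexuallyExplicit(imageData) else {
        return .showNormally
    }
    // The photo is blurred and the child is warned; for younger children,
    // parents can additionally be notified if the child chooses to view it.
    return .blurAndWarn(notifyParentsIfViewed: account.parentalNotificationEnabled)
}
```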

When asked whether governments could force Apple to detect other content via the CSAM features, Apple answers:

Could governments force Apple to add non-CSAM images to the hash list?

Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.
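As a rough illustration of the pipeline Apple describes, the following Swift sketch uses a cryptographic hash purely as a stand-in for Apple’s actual image-matching technology. None of these names are real Apple APIs; the point is simply the shape of the process from the FAQ: only matches against a fixed list of known CSAM hashes are ever surfaced, and a human review step sits between any match and a report to NCMEC.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch only: the names are invented, and SHA-256 stands in
// for the image-matching technology Apple actually uses.

/// Hashes of known CSAM images supplied by child safety organizations
/// (placeholder: empty for illustration).
let knownCSAMHashes: Set<String> = []

/// Hex digest of an image's raw bytes (stand-in for the real matching step).
func digest(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Only photos whose digest matches the known list are ever surfaced;
/// non-matching photos reveal nothing and trigger nothing.
func matchesKnownCSAM(_ imageData: Data) -> Bool {
    knownCSAMHashes.contains(digest(of: imageData))
}

func process(uploadedPhoto imageData: Data) {
    guard matchesKnownCSAM(imageData) else { return }  // no match, no report
    // Per the FAQ, a human reviewer confirms the match before any report
    // is filed; a false flag leaves the account untouched.
    sendForHumanReview(imageData)
}

func sendForHumanReview(_ imageData: Data) {
    // Placeholder for the human review step described in the FAQ.
}
```

Again, this is a simplification of what Apple outlines, not its implementation; it only illustrates why, per the FAQ, photos that do not match known CSAM images produce no information and no report.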

The rest of the FAQ goes into the specifics of the three features, namely communication safety in Messages, CSAM detection, and security for CSAM detection in iCloud Photos. You can read the full FAQ here.

With the FAQ, Apple is aiming to resolve the privacy concerns surrounding CSAM detection. But are you convinced? How do you feel about Apple scanning your iCloud Photo Library for CSAM? Do you think this is a breach of your privacy? Drop a comment and let us know your thoughts!