Craig Federighi Attempts to Clear ‘Confusion’ Around Apple’s Controversial Child Safety Feature

BY Mahit Huilgol

Published 13 Aug 2021


Last week Apple announced a new feature that scans photos uploaded to iCloud Photos for Child Sexual Abuse Material (CSAM). Soon after, Apple sent an internal memo assuring employees that the CSAM feature is safe. Now Craig Federighi, Apple’s senior vice president of Software Engineering, has cleared the air in a new interview with The Wall Street Journal, detailing the privacy mechanisms built into the new feature in a bid to alleviate confusion.

Federighi set the tone for the interview by acknowledging that Apple had mismanaged last week’s announcement. That announcement covered two features: one that weeds out explicit content in iMessage and another that scans iCloud Photos for CSAM.

It’s really clear a lot of messages got jumbled pretty badly in terms of how things were understood. We wish that this would’ve come out a little more clearly for everyone because we feel very positive and strongly about what we’re doing.

Federighi further added that announcing both features together was perhaps not a good idea, since people conflated the two and naturally had privacy concerns. Security researchers and privacy advocates aired concerns over how governments and other parties could get hold of personal data. Federighi says the data will be subjected to “multiple levels of auditability.”

The feature alerts Apple only when 30 or more pictures in an iPhone’s library match known CSAM content. Apple then reviews the alert and confirms whether the images are a genuine match or a false alarm. He adds that the matching process itself runs locally on the device rather than on iCloud’s servers. Apple will use a database of image hashes supplied by multiple child-protection organizations, and the process will include an independent auditor.
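To make the threshold mechanism concrete, here is a minimal sketch in Python of how a 30-match rule could work. It is purely illustrative, not Apple’s implementation: the real system uses perceptual hashing (NeuralHash) plus cryptographic techniques so individual matches stay hidden until the threshold is crossed, and every name below (ALERT_THRESHOLD, count_matches, should_alert) is our own invention for demonstration.

```python
# Illustrative sketch only -- not Apple's implementation. Apple's system uses
# NeuralHash with cryptographic threshold techniques; the plain set lookup
# and all names here are assumptions for demonstration purposes.

ALERT_THRESHOLD = 30  # per the WSJ interview, ~30 matches before an alert fires

def count_matches(photo_hashes, known_csam_hashes):
    """Count how many of a library's photo hashes appear in the known database."""
    return sum(1 for h in photo_hashes if h in known_csam_hashes)

def should_alert(photo_hashes, known_csam_hashes):
    """Flag the account for human review only at or above the threshold."""
    return count_matches(photo_hashes, known_csam_hashes) >= ALERT_THRESHOLD

# 29 matches stays silent; a 30th match crosses the threshold.
known = {f"hash_{i}" for i in range(100)}
library = [f"hash_{i}" for i in range(29)] + ["unrelated"] * 500
assert not should_alert(library, known)
assert should_alert(library + ["hash_29"], known)
print("threshold demo passed")
```

The point of the threshold, as Federighi framed it, is that a single stray hash collision never triggers human review; only a sizable cluster of matches does.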

Our Take

Apple’s explanation seems to do very little to clear the confusion. We have seen in the past how Apple contractors ended up listening to hours of private conversations while grading Siri recordings. Furthermore, the feature is not watertight, as someone trading in CSAM can simply stop using iCloud Photos. Are you satisfied with Apple’s explanation for the CSAM feature? Share your thoughts.

[via WSJ]