Apple Delays Rollout of Controversial CSAM Feature to ‘Make Improvements’

By Rajesh Pandey

Published 3 Sep 2021

Following weeks of criticism, Apple has delayed the rollout of its child safety features, including scanning iCloud Photos and media received in iMessage for child sexual abuse material (CSAM).

Apple’s move comes after it was heavily criticised for its plan to scan iCloud Photos for CSAM. Many users and privacy experts argued that such a scanning system could eventually be misused, with governments and other powerful organisations pressuring Apple to expand it for their own purposes.

The company will now take “additional time” to make improvements to its child safety features.

Below is Apple’s full statement on the matter:

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Apple initially announced that it would roll out CSAM detection with the release of iOS 15 and macOS Monterey later this year. After the initial backlash, Apple posted a detailed FAQ explaining how CSAM detection works and why it’s privacy-friendly.
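Conceptually, the system Apple described matches hashes of on-device photos against a database of known CSAM hashes and only flags an account after a match threshold is crossed. The Python sketch below illustrates that threshold-matching idea only; the hash function, database entries, and exact threshold here are placeholder assumptions, not Apple’s actual NeuralHash implementation or its cryptographic safety-voucher protocol.

```python
# Minimal sketch of threshold-based hash matching, loosely modelled on
# Apple's published description of CSAM detection. SHA-256 stands in for
# the perceptual hash (NeuralHash) the real system uses; the database
# contents and threshold below are illustrative placeholders.
import hashlib

# Hypothetical database of known CSAM image hashes (placeholder value).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

# Apple stated a threshold of roughly 30 matches before human review.
MATCH_THRESHOLD = 30


def image_hash(image_bytes: bytes) -> str:
    """Placeholder hash; the real system uses a perceptual NeuralHash."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(photos: list[bytes]) -> int:
    """Count photos whose hash appears in the known-hash database."""
    return sum(1 for img in photos if image_hash(img) in KNOWN_HASHES)


def should_flag_for_review(photos: list[bytes]) -> bool:
    """Flag an account only once matches cross the review threshold."""
    return count_matches(photos) >= MATCH_THRESHOLD
```

The threshold is the privacy-relevant design choice: no single matching photo triggers a review, only an accumulation of matches against known material.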

Apple’s statement does not make clear when it now plans to release its CSAM detection system.