Apple removes controversial CSAM detection feature from webpage, but says plans haven’t changed

Apple has updated a web page on its child safety features to remove all references to its controversial child sexual abuse material (CSAM) detection feature, first announced in August. The change, spotted by MacRumors, appears to have taken place between December 10 and 13. But despite the change to its website, the company says its plans for the feature haven’t changed.

Two of the three safety features, released earlier this week with iOS 15.2, are still present on the page titled “Expanded Protections for Children.” However, references to the more controversial CSAM detection, whose launch was delayed following backlash from privacy advocates, have been removed.

Contacted for comment, Apple spokesman Shane Bauer said the company’s position had not changed since September, when it first announced it would delay the launch of CSAM detection. “Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the company said in September.

Importantly, Apple’s statement doesn’t say the feature has been canceled entirely. Documents describing how the feature works are still online on Apple’s website.

Apple’s CSAM detection feature was controversial when it was announced because it involves taking hashes of iCloud Photos and comparing them to a database of hashes of known child sexual abuse imagery. Apple says this approach allows it to report users to authorities if they are found to be uploading known child abuse images, without compromising the privacy of its customers more generally. The company also says that the encryption of user data is unaffected and that the scanning is performed on the device itself.
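The core matching idea described above can be illustrated with a minimal sketch. Note the heavy simplification: Apple’s actual system uses NeuralHash, a perceptual hash, combined with cryptographic protocols such as private set intersection; the snippet below uses a plain SHA-256 set lookup purely to show what “hashing a photo and checking it against a database of known hashes” means. All names and data here are hypothetical.

```python
import hashlib

# Hypothetical database of hashes of known images. (Apple's real system
# uses perceptual NeuralHash values and blinded cryptographic matching,
# not raw SHA-256 digests; this is an illustration only.)
known_hashes = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def matches_known_database(photo_bytes: bytes) -> bool:
    """Hash a photo's bytes and check the digest against the database."""
    digest = hashlib.sha256(photo_bytes).hexdigest()
    return digest in known_hashes

print(matches_known_database(b"known-image-bytes"))  # a known image matches
print(matches_known_database(b"an-ordinary-photo"))  # an unrelated photo does not
```

One consequence of this design, and a reason Apple chose a perceptual hash rather than an exact cryptographic hash, is that exact hashing would miss images that have been resized or re-encoded, while a perceptual hash tolerates such transformations.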

But critics argue that the system risks undermining Apple’s end-to-end encryption. Some have called it a “backdoor” that governments around the world could push Apple to expand to cover content beyond CSAM. For its part, Apple has said it “would refuse any such demands” from a government to extend the system beyond CSAM.

While the CSAM detection feature has yet to receive a new launch date, Apple has gone ahead and released two of the other child protection features it announced in August. One is designed to warn children when they receive images containing nudity in Messages, while the second provides additional information when searching for terms related to child exploitation through Siri, Spotlight, or Safari Search. Both rolled out with iOS 15.2, which was released earlier this week and appears to have prompted Apple to update its webpage.
