Apple removes controversial child abuse detection tool from webpage
Apple has removed all references to its controversial child sexual abuse material (CSAM) detection feature from its child safety webpage.
Announced in August, the child safety features aimed to protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of child sexual abuse material. The plan comprised three tools: scanning users' iCloud photo libraries for CSAM, communication safety in Messages to notify children and their parents when sexually explicit photos are received or sent, and expanded CSAM guidance in Siri and Search.
Two of the three safety features, which were released earlier this week with iOS 15.2, still appear on the page, titled "Expanded Protections for Children."
However, references to CSAM detection, whose launch was delayed following backlash from nonprofit and advocacy groups, researchers and others, have been removed, MacRumors reports.
The tech giant said its stance hasn't changed since September, when it first announced it would delay the launch of CSAM detection.
"Based on feedback from customers, advocacy groups, researchers, and others, we've decided to take more time over the next few months to gather feedback and make improvements before releasing these critically important child safety features," Apple said in September.
Following the announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, privacy whistleblower Edward Snowden, Facebook's former security chief, and politicians.
Apple has sought to clear up misunderstandings and reassure users by publishing detailed information and FAQs, releasing new documents, and making company executives available for interviews.
According to reports, the iOS update will allow parents to protect their children and help them learn to navigate online communication in Messages.
Support for the new communication safety feature in Messages first appeared in the second developer beta of iOS 15.2.
With this update, Messages can use on-device machine learning to analyze image attachments and determine whether a shared photo is sexually explicit, TechCrunch reported.
(Only the title and image of this report may have been edited by Business Standard staff; the rest of the content is auto-generated from a syndicated feed.)