Apple removes references to controversial child abuse detection tool from webpage

Apple has removed all references to its controversial Child Sexual Abuse Material (CSAM) detection feature from its child safety webpage.

Announced in August, the child safety features were intended to protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of child sexual abuse material.

CSAM detection was one of three features announced: scanning users’ iCloud Photos libraries for known CSAM, a communication safety feature that notifies children and their parents when sexually explicit photos are received or sent, and expanded CSAM guidance in Siri and Search.

Two of the three safety features, which were released earlier this week with iOS 15.2, are still featured on the page, titled “Expanded Protections for Children.”

However, references to CSAM detection, whose launch was delayed following backlash from nonprofits, advocacy groups, researchers and others, have been removed, MacRumors reports.

The tech giant, however, said its stance hasn’t changed since September, when it first announced it would delay the launch of CSAM detection.

“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple said in September.

Following the announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, whistleblower Edward Snowden, Facebook’s former security chief and politicians.

Apple has sought to clear up misunderstandings and reassure users by publishing detailed information and FAQs, releasing new documents, and making company executives available for interviews.

According to reports, the update allows parents to protect their children and helps them learn to navigate online communication in Messages.

Support for the new communication safety feature in Messages first appeared in the second developer beta of iOS 15.2.

With this update, Messages can use on-device machine learning to analyze image attachments and determine whether a shared photo is sexually explicit, TechCrunch reported.

–IANS


