Apple Removes Controversial Child Abuse Detection Tool From Webpage

Apple has removed all references to its controversial child sexual abuse material (CSAM) detection feature from its child safety webpage.

Announced in August, the CSAM feature is intended to protect children from predators who use communication tools to recruit and exploit them and to limit the spread of Child Sexual Abuse Material.

It was one of three announced child safety features: scanning users’ iCloud Photos libraries for CSAM, Communication Safety, which warns children and their parents when they receive or send sexually explicit photos, and expanded CSAM guidance in Siri and Search.

Two of the three safety features, released earlier this week with iOS 15.2, are still listed on the page titled “Expanded Protections for Children”.

However, references to the CSAM detection, whose launch was delayed following backlash from non-profit and advocacy groups, researchers, and others, have been removed, reports MacRumors.

The tech giant said its position had not changed since September, when it first announced it would delay the launch of the CSAM detection feature.

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple had said in September.

Following the announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, privacy whistleblower Edward Snowden, Facebook’s former security chief, and politicians.

Apple endeavored to dispel misunderstandings and reassure users by releasing detailed information and FAQs, publishing new documents, and making company executives available for interviews.

According to reports, an upcoming Apple iOS update will allow parents to protect their children and help them learn to navigate online communication in Messages.

The second developer beta of iOS 15.2 included support for the new communication safety feature in Messages.

With this update, Apple Messages will be able to use on-device machine learning to analyze image attachments and determine if a photo being shared is sexually explicit, TechCrunch had reported.
