
Apple Refuses to Let Govts Spy via Its Child Abuse Detection Tool

IANS

Facing criticism from several quarters over its child safety initiatives for iCloud Photos and Messages, Apple has stressed that it will not allow any government to conduct surveillance via the tool designed to detect and curb child sexual abuse material (CSAM) in iCloud Photos.

Last week, Apple confirmed plans to deploy new technology within iOS, macOS, watchOS, and iMessage that will detect potential child abuse imagery.

Apple said it would not accede to any government’s request to expand the technology.

“Apple will refuse any such demands. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future,” the company said in a new document.

Apple said the tool does not impact users who have not chosen to use iCloud Photos.

“There is no impact to any other on-device data. This feature does not apply to Messages,” the company noted.

Epic Games CEO Tim Sweeney had earlier attacked Apple over the initiatives.

“This is government spyware installed by Apple based on a presumption of guilt. Though Apple wrote the code, its function is to scan personal data and report it to the government,” Sweeney posted on Twitter.

WhatsApp head Will Cathcart had also slammed Apple over its plans to launch the photo identification measures, saying that Apple's software can scan all the private photos on a user's phone, which he called a clear privacy violation.

Stressing that WhatsApp will not allow such Apple tools to run on its platform, Cathcart said that Apple has long needed to do more to fight child sexual abuse material (CSAM), "but the approach they are taking introduces something very concerning into the world".

Apple said that the ‘CSAM detection in iCloud Photos’ tool is designed to keep CSAM off iCloud Photos without providing Apple information about photos other than those that match known CSAM images.

"This technology is limited to detecting CSAM stored in iCloud, and we will not accede to any government's request to expand it," the company added.

The company further said that the feature does not work on the private photo library stored on the device itself.
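Put concretely, the behavior Apple describes amounts to a membership test against a fixed database of known-image fingerprints, gated on the iCloud Photos opt-in. The following Python sketch is purely illustrative and is not Apple's implementation: every name in it is hypothetical, and SHA-256 stands in for Apple's perceptual NeuralHash only to keep the example self-contained (a real perceptual hash matches visually similar images, not just byte-identical ones).

```python
# Illustrative sketch only -- NOT Apple's implementation. All names are
# hypothetical; SHA-256 stands in for Apple's perceptual NeuralHash.
import hashlib
from typing import Iterable, List, Set

def image_fingerprint(image_bytes: bytes) -> str:
    # Stand-in fingerprint. A real perceptual hash would match visually
    # similar images; a cryptographic hash only matches identical bytes.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_against_known_set(photos: Iterable[bytes],
                              known_csam_fingerprints: Set[str],
                              icloud_photos_enabled: bool) -> List[str]:
    """Return fingerprints of photos that match the known-image set.

    Mirrors the two properties Apple describes:
    * nothing is checked unless iCloud Photos is enabled, and
    * non-matching photos reveal no information at all.
    """
    if not icloud_photos_enabled:
        # "There is no impact to any other on-device data."
        return []
    return [fp for photo in photos
            if (fp := image_fingerprint(photo)) in known_csam_fingerprints]
```

In the system Apple has publicly described, this comparison is additionally wrapped in cryptographic protections, so that even matching results are not visible to Apple until an account accumulates multiple hits against the known database.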
