Apple to Scan iCloud Photos and Analyze the Contents of Messages

Bipasha Mandal


This week, Apple announced new measures to detect child sexual abuse material (CSAM) in Messages and in users' iCloud photo libraries. Apple also plans to update Siri and its search functions to intervene when a user searches for such material. The Electronic Frontier Foundation (EFF) has issued a warning about the risks these measures pose to user privacy. Apple said the features will arrive later this year with the release of iOS 15, iPadOS 15, and macOS Monterey.

Apple is planning a three-part rollout. The first part adds new tools to the built-in Messages app on iOS, iPadOS, and macOS for accounts that are part of a family account. An on-device machine-learning model will analyze the images attached to messages. If the system detects that a message received by a child contains a sexually explicit image, it will automatically blur the image and show a warning. If the child chooses to view the image anyway, the linked parent account will be notified. Similarly, if a child attempts to send such an image, the system warns the child before the message is sent and can notify the parents if the child sends it anyway. A rough sketch of this flow follows below.
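
To make the described flow concrete, here is a minimal sketch in Swift of how an on-device check of this kind could be wired together. The type names, the classifier stub, and the notification callback are placeholders invented for illustration; they are not Apple's actual APIs or model.

```swift
import Foundation

// Hypothetical sketch of the described Messages flow for child accounts.
// Nothing here is Apple's real implementation; the classifier is a stub.

struct IncomingImage {
    let data: Data
}

enum ChildChoice {
    case declined      // child backs out after seeing the warning
    case viewedAnyway  // child confirms they want to see the image
}

/// Placeholder for the on-device machine-learning classifier.
func looksSexuallyExplicit(_ image: IncomingImage) -> Bool {
    // A real implementation would run a local ML model; this stub always says no.
    return false
}

/// Handles an image received by a child account on a family plan.
func handleReceivedImage(_ image: IncomingImage,
                         childChoice: () -> ChildChoice,
                         notifyParent: (String) -> Void) {
    guard looksSexuallyExplicit(image) else {
        // Nothing flagged; the image is shown normally.
        return
    }

    // 1. Blur the image and warn the child (UI omitted in this sketch).
    print("Image blurred; warning shown to the child.")

    // 2. If the child chooses to view it anyway, the linked parent account is notified.
    if childChoice() == .viewedAnyway {
        notifyParent("Your child viewed an image that was flagged as sensitive.")
    }
}

// Example wiring with stub closures.
handleReceivedImage(IncomingImage(data: Data()),
                    childChoice: { .viewedAnyway },
                    notifyParent: { message in print("Parent notified:", message) })
```

The key design point the announcement emphasizes is that the analysis runs on the device, so the message contents themselves are not sent to Apple for this check.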

The second part targets the spread of such material through iCloud Photos. Rather than inspecting the photos themselves, Apple will match image hashes against a database of known CSAM, using a cryptographic technique called private set intersection so that non-matching photos are not revealed. Apple will only act after a certain number of a user's photos match the database and after human reviewers manually confirm that the flagged photos are indeed CSAM. If they are, Apple will disable the account and file a report with the National Center for Missing & Exploited Children (NCMEC). The idea is sketched after this paragraph.
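
The matching step can be pictured as hash lookups plus a review threshold. The Swift sketch below is a simplified illustration under that assumption: the hash function, the threshold value, and the database loader are placeholders, and Apple's actual hashing scheme and private set intersection protocol are not reproduced here.

```swift
import Foundation

// Simplified sketch of threshold-based matching against a database of known
// CSAM image hashes. All values and functions here are placeholders.

/// Stub: a real client would ship an encrypted database of known-CSAM hashes
/// supplied by child-safety organizations.
func loadKnownHashDatabase() -> Set<String> {
    return []
}

/// Placeholder perceptual hash; a real system hashes visual features rather than
/// raw bytes, so resized or re-encoded copies of the same image still match.
func perceptualHash(of photo: Data) -> String {
    return String(photo.hashValue)
}

/// Counts matches across a user's photo library and reports whether the account
/// should be escalated to manual human review.
func shouldEscalateForReview(photos: [Data],
                             knownHashes: Set<String>,
                             threshold: Int) -> Bool {
    let matchCount = photos.filter { knownHashes.contains(perceptualHash(of: $0)) }.count
    // Only after the match count crosses the threshold does a human reviewer
    // check the flagged photos; a single match is never acted on by itself.
    return matchCount >= threshold
}

// Example use with placeholder values (the real threshold is not public).
let knownHashes = loadKnownHashDatabase()
let escalate = shouldEscalateForReview(photos: [], knownHashes: knownHashes, threshold: 30)
print(escalate ? "Escalate to manual review" : "No action")
```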

Finally, Apple will also update Siri and Search. If a user searches for CSAM-related content, they will be shown a message explaining that the material they are trying to access is harmful, as sketched below.
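
The Siri and Search change amounts to a query-time intervention. The short sketch below illustrates the idea with a placeholder term list; it is not Apple's detection logic.

```swift
import Foundation

// Minimal sketch of a search intervention: if a query relates to CSAM,
// return a warning instead of results. The term list is a placeholder.

let flaggedTerms: Set<String> = ["placeholder-term"] // hypothetical; real terms omitted

func respondToSearch(_ query: String) -> String {
    let words = query.lowercased().split(separator: " ").map(String.init)
    if words.contains(where: { flaggedTerms.contains($0) }) {
        return "Warning: this kind of content is harmful."
    }
    return "Showing normal search results…"
}

print(respondToSearch("example query"))
```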

The EFF's main concern is that a system capable of identifying and analyzing CSAM is not limited to that purpose: the same machine-learning and matching infrastructure could be turned toward other content stored in a user's iCloud.
