This week Apple announced new measures to detect child sexual abuse material (CSAM) in Messages and in users’ iCloud photos. Apple also plans to update Siri and Search to intervene whenever a user searches for such material. The Electronic Frontier Foundation (EFF) has issued a warning decrying the risks these measures pose to user privacy. Apple says the features will launch later this year with the release of iOS 15, iPadOS 15, and macOS Monterey.

Apple is planning a three-part rollout. The first part adds new tools to the built-in Messages app on iOS, iPadOS, and macOS for users on family accounts: an on-device machine-learning model will analyze the images in messages. If the model detects that a message received by a child contains a sexually explicit image, it will automatically blur the image and display a warning. If the child chooses to view the image anyway, the parent account linked to that child’s account will be notified. Similarly, if a child attempts to send such an image, the system issues a warning before the message is sent, and the parents can be notified.
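Apple has not published the logic behind this feature, but the flow described above can be sketched roughly as follows. Everything here is hypothetical: the function names, the idea of a single confidence score, and the threshold value are illustrative stand-ins for an on-device classifier whose internals are not public.

```python
# Illustrative sketch only -- not Apple's actual implementation.
# A hypothetical on-device classifier returns a confidence score in [0, 1];
# the threshold below is an assumption for demonstration.
EXPLICIT_THRESHOLD = 0.9

def handle_incoming_image(explicit_score: float, child_views_anyway: bool) -> list:
    """Return the list of actions a child's device would take for one image.

    explicit_score: hypothetical classifier confidence that the image
    is sexually explicit; child_views_anyway: whether the child taps
    through the warning to view the blurred image.
    """
    actions = []
    if explicit_score >= EXPLICIT_THRESHOLD:
        # Image is flagged: blur it and warn the child first.
        actions.append("blur_image")
        actions.append("show_warning")
        if child_views_anyway:
            # Only if the child proceeds is the linked parent notified.
            actions.append("notify_parent")
    else:
        actions.append("display_normally")
    return actions
```

Note that in this design the parent is notified only after the child overrides the warning; a merely flagged-and-blurred image triggers no notification.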

The second part targets the dissemination of such material: Apple will scan photos that users store in iCloud. Rather than inspecting the photos directly, Apple will match them against a database of known CSAM image hashes using a cryptographic technique called Private Set Intersection. Apple will take action only after a threshold number of a user’s photos are flagged against the database, and only after a manual review confirms that the flagged photos really are CSAM. If they are, Apple will disable the user’s account and file a report with the National Center for Missing & Exploited Children (NCMEC).
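The threshold logic can be sketched in a few lines. This is a heavily simplified illustration, not Apple’s pipeline: the real system uses a perceptual hash (NeuralHash) and Private Set Intersection so the server learns nothing about non-matching photos, whereas this sketch uses a plain SHA-256 over raw bytes and a local set lookup, and the threshold value is an assumption since Apple has not disclosed the real number.

```python
# Illustrative sketch only -- not Apple's actual matching pipeline.
import hashlib

MATCH_THRESHOLD = 3  # hypothetical; the real threshold is undisclosed

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash of the image (Apple uses NeuralHash).
    return hashlib.sha256(image_bytes).hexdigest()

def flag_for_review(user_photos: list, known_csam_hashes: set) -> bool:
    """Return True only if the number of hash matches reaches the
    threshold, at which point a human reviewer would verify the
    flagged photos before any account action is taken."""
    matches = sum(1 for photo in user_photos
                  if image_hash(photo) in known_csam_hashes)
    return matches >= MATCH_THRESHOLD
```

The key design point the threshold captures is tolerance for false positives: a single hash collision never triggers review, and no human sees anything until multiple independent matches accumulate.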

Finally, Apple will also update Siri and Search. If a user searches for CSAM-related content, they will receive a message explaining that the content they are trying to access is harmful.

The EFF’s main concern is that a machine-learning system capable of identifying CSAM could just as easily be repurposed to find and analyze other kinds of content stored in a user’s iCloud; once the scanning infrastructure exists, its scope can be broadened.