The tool could eventually become a “key ingredient” in adding surveillance to encrypted messaging systems.
Apple is reportedly developing a tool that would scan your iPhone photos for child sexual abuse material (CSAM), including content related to child pornography. The new system, which is expected to be announced soon, would work on the client side — on the user’s device — looking for specific perceptual hashes and reporting to Apple’s servers only if matches appear in large numbers. The idea is that carrying out the checks on the user’s device protects the user’s privacy, though it is not yet clear whether the system could be misused.
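Apple has not published how its system would work, so the following Python sketch is purely illustrative: it shows the general idea of client-side perceptual hash matching with a reporting threshold. The hash values, distance cut-off, and threshold are all hypothetical assumptions, not Apple's actual parameters.

```python
# Illustrative sketch only: match on-device photo hashes against a known hash
# list and report only after enough matches accumulate. All values below are
# placeholders; Apple has not disclosed its real algorithm or thresholds.

KNOWN_HASHES = {0x9F3A5C7E12B4D688, 0x0123456789ABCDEF}  # placeholder 64-bit hashes
MATCH_DISTANCE = 5      # max Hamming distance to count as a match (assumed)
REPORT_THRESHOLD = 10   # matches needed before anything is reported (assumed)

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit perceptual hashes."""
    return bin(a ^ b).count("1")

def is_match(photo_hash: int) -> bool:
    """True if the photo's hash is close to any hash on the known list."""
    return any(hamming_distance(photo_hash, h) <= MATCH_DISTANCE for h in KNOWN_HASHES)

def should_report(photo_hashes: list[int]) -> bool:
    """Decide on-device whether enough matches have accumulated to notify a server."""
    matches = sum(1 for h in photo_hashes if is_match(h))
    return matches >= REPORT_THRESHOLD
```

The thresholding step is the part meant to limit false positives: a single near-match on a perceptual hash would not, by itself, trigger a report in a design like this.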
Cybersecurity expert Matthew Daniel Green, an Associate Professor at the Johns Hopkins Information Security Institute in the US, tweeted about Apple’s plans to launch the client-side system to detect child abuse images on the iPhone. He said the tool under development could eventually become a “key ingredient” in adding surveillance to encrypted messaging systems.
“The way Apple is doing this launch, they’re going to start with non-E2E [non-end-to-end] photos that people have already shared with the cloud. So it doesn’t ‘hurt’ anyone’s privacy. But you have to ask why anyone would develop a system like this if scanning E2E photos wasn’t the goal,” Green said in a detailed thread on Twitter.
Apple’s new tool may raise concerns among users: even with enough safeguards against misuse, it could turn up false positives. Governments may also be able to abuse the system to go beyond looking for illegal child content and search for other kinds of media, including politically sensitive material.
Gadgets 360 has reached out to Apple for a comment on the development of the reported tool and will update this space when the company responds.
In the past, Apple was found to have deployed similar hashing techniques to look for child abuse content in its iPhone users’ emails. The Cupertino company was also reported last year to have dropped plans for fully encrypted iCloud backups, quietly leaving a way in for law enforcement and intelligence agencies.
However, the new move appears to have been designed with privacy in mind, as the tool would run on the user’s device without needing to send images to the cloud. The exact scope of the tool remains unclear as Apple has not shared any official details, but Green tweeted that an announcement could come as soon as this week.