A hot potato: When Apple revealed last week that it'll use neuralMatch to scan all iPhone and iCloud accounts for CSAM (child sexual abuse material) when iOS 15 rolls out this fall, privacy advocates expressed concern that it would be a serious new breach of user privacy. One major worry is that governments could demand Apple add material outside neuralMatch's original scope, such as political content, to its watchlist.
Apple says it won't let governments expand the scope of its upcoming child-protection image-scanning system, but privacy concerns remain. This past weekend, Apple published an FAQ clarifying how neuralMatch will work. Most importantly, Apple asserts that it will only use the system to detect known CSAM, and that the system isn't designed to do anything else.
“Apple will refuse any such demands,” the FAQ reads. “Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups.” Apple stresses that the system won’t automatically notify law enforcement, instead conducting human review of any flagged images.
The FAQ also tries to assuage concerns that non-CSAM images might be injected into neuralMatch in order to flag accounts. "Apple does not add to the set of known CSAM image hashes," it says. "The same set of hashes is stored in the operating system of every iPhone and iPad user, so targeted attacks against only specific individuals are not possible under our design."
Some privacy advocates aren’t convinced. Security engineering and Royal Society professor Steven Murdoch pointed out past cases in which this promise hasn’t held. “[United Kingdom ISPs] lost in court and did it anyway. Will Apple leave a market if put in the same position?”
Apple allude to where they refused to build new functionality to unlock an iPhone. That's different from adding a hash to an existing database. In other cases Apple have acceded to legal demands to reduce security, e.g. limiting availability of Private Relay and iCloud encryption — Steven Murdoch (@sjmurdoch) August 9, 2021
Apple's system will check image hashes against a database of hashes of known CSAM images supplied by the National Center for Missing & Exploited Children, a database Apple won't have control over.
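At its core, that matching step is a set-membership check: compute a hash of each image and test whether it appears in the supplied database. The sketch below is illustrative only — Apple's real system uses NeuralHash, a perceptual hash designed to survive resizing and re-encoding, combined with private set intersection so matches aren't visible in the clear; the plain SHA-256 and cleartext hash set here are stand-in assumptions, and the sample hash value is a placeholder.

```python
import hashlib

# Placeholder for the on-device database of known-image hashes.
# In reality these would be NeuralHash values supplied by NCMEC
# and other child-safety groups; this value is made up.
KNOWN_IMAGE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_hash(data: bytes) -> str:
    """SHA-256 as an illustrative stand-in for a perceptual hash."""
    return hashlib.sha256(data).hexdigest()

def is_flagged(data: bytes) -> bool:
    """True if the image's hash appears in the known-hash database."""
    return image_hash(data) in KNOWN_IMAGE_HASHES
```

Because a cryptographic hash changes completely if even one byte of the image changes, a real deployment needs a perceptual hash like NeuralHash for this check to be robust against trivial edits.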
Johns Hopkins University cryptography researcher Matthew Green theorized a scenario where the United States Department of Justice could simply go over Apple’s head to the NCMEC directly, asking it to add non-CSAM material, and the NCMEC could acquiesce without Apple knowing.
Now obviously NCMEC might say “no” to the request, but they might not, especially if it could be posed as protecting children. And Apple might only see hashes, so wouldn’t know this happened.— Matthew Green (@matthew_d_green) August 9, 2021
Asked on a call what it would do if the Chinese government demanded it scan for non-CSAM content, Apple told Motherboard's Joseph Cox that the system is not launching in China.