A hot potato: When Apple revealed last week that it'll use neuralMatch to scan all iPhone and iCloud accounts for CSAM (child sexual abuse material) when iOS 15 rolls out this fall, privacy advocates warned that it would mark a serious new intrusion into user privacy. One major concern is that governments could demand Apple add material outside neuralMatch's original scope, such as political content, to its watchlist.

Apple says it won't let governments expand the scope of its upcoming child-protection image-scanning system, but privacy concerns remain. This past weekend, Apple published an FAQ clarifying how neuralMatch will work. Most importantly, Apple asserts it will use the system only to detect known CSAM, and that it isn't designed to do anything else.

"Apple will refuse any such demands," the FAQ reads. "Apple's CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups." Apple stresses that the system won't automatically notify law enforcement, instead conducting human review of any flagged images.

The FAQ also tries to assuage concerns that non-CSAM material might be injected into neuralMatch to flag accounts. "Apple does not add to the set of known CSAM image hashes," it says. "The same set of hashes is stored in the operating system of every iPhone and iPad user, so targeted attacks against only specific individuals are not possible under our design."

Some privacy advocates aren't convinced. Security engineering professor and Royal Society research fellow Steven Murdoch pointed to past cases in which similar promises haven't held up. "[United Kingdom ISPs] lost in court and did it anyway. Will Apple leave a market if put in the same position?"

Apple's system will check image hashes against a database of hashes of known CSAM supplied by the National Center for Missing & Exploited Children (NCMEC), a database Apple won't control.
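
For illustration, the core idea is an on-device lookup of an image's hash in a fixed list shipped with the operating system. The Swift sketch below is a deliberately simplified approximation: the hash set, the placeholder digest, and the function names are hypothetical stand-ins, and it uses an exact SHA-256 match, whereas Apple's actual design relies on a perceptual NeuralHash and private set intersection.

```swift
import Foundation
import CryptoKit

// Illustrative only: a plain cryptographic-hash lookup. Apple's real system
// uses a perceptual "NeuralHash" plus private set intersection, so exact
// SHA-256 matching here is a stand-in, not the shipped algorithm.

// Per Apple's FAQ, the same fixed set of known hashes ships inside the
// operating system of every device; this placeholder digest is hypothetical.
let knownCSAMHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

// Returns true if the image's digest appears in the on-device hash list.
func matchesKnownHash(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownCSAMHashes.contains(hex)
}

// Example: a match would be flagged for human review, not sent straight
// to law enforcement.
let photo = Data([0x89, 0x50, 0x4e, 0x47])  // stand-in image bytes
print(matchesKnownHash(photo))              // false for this placeholder data
```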

Johns Hopkins University cryptography researcher Matthew Green theorized a scenario in which the United States Department of Justice could simply go over Apple's head to NCMEC directly, asking it to add non-CSAM material, and NCMEC could acquiesce without Apple knowing.

When asked on a call what the company would do if the Chinese government demanded it scan for non-CSAM content, Apple told Motherboard's Joseph Cox that the system is not launching in China.