Apple scrubs its support pages of all mentions of its controversial CSAM image scanning...

Cal Jeffrey

Staff member
A hot potato: Apple's controversial CSAM (child sexual abuse material) scanning feature appears to have been canned. The company quietly scrubbed its child safety support pages of all mention of the formerly upcoming iOS feature. The functionality was already on indefinite hold, so the removal could mean Apple has canceled the project entirely. The company has not commented on the situation.

Apple first announced CSAM scanning in early August, and it immediately stirred up criticism from privacy advocates. Cupertino's engineers designed the system to scan devices anonymously for images of child abuse using a hashing scheme. If the system found enough matching hashes, it would escalate the flagged pictures to human review and potentially to law enforcement.
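For a rough sense of how that kind of threshold-based hash matching works, here is a minimal, hypothetical Python sketch. Every name and value in it is an assumption made for illustration: Apple's actual design relied on a perceptual hash (NeuralHash) and cryptographic threshold techniques rather than the plain file digest used below, and the hash database and threshold shown here are placeholders.

# Hypothetical sketch of threshold-based hash matching, not Apple's actual code.
# A plain SHA-256 file digest stands in for a perceptual hash purely to keep
# the example self-contained and runnable.
import hashlib
from pathlib import Path

# Would be populated from a provided database of known-image hashes (placeholder).
KNOWN_HASHES: set[str] = set()

# Illustrative threshold: number of matches required before anything is escalated.
MATCH_THRESHOLD = 30

def image_hash(path: Path) -> str:
    """Digest of the raw file bytes (a stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def count_matches(photos: list[Path]) -> int:
    """Count photos whose hash appears in the known-hash database."""
    return sum(1 for p in photos if image_hash(p) in KNOWN_HASHES)

def should_escalate(photos: list[Path]) -> bool:
    """Flag a photo library for human review only once matches cross the threshold."""
    return count_matches(photos) >= MATCH_THRESHOLD

The key design point the sketch captures is that no single match triggers anything; only an accumulation of matches past a preset threshold would surface the material for review.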

The inherent problems with the system were readily evident. Those who were found to have CSAM would face prosecution on evidence gathered under a blatant violation of their Fourth Amendment rights. Additionally, critics had concerns that such a system could produce false positives, at least at the machine level. Innocent users could have their photos viewed by another human without their permission if the scan returned enough hashes. There was also unease over the possibility that oppressive governments could order the system to be turned against dissidents.

Apple argued at the time that people were misinterpreting how the scanning would work and promised that it would never cave to governmental demands to misuse the system. In a misguided attempt to address the backlash, Apple also revealed that it had already been running CSAM-detection algorithms on iCloud email for the last three years.

Instead of easing concerns, the email-scanning admission only stirred the pot further. Pressure grew to the point that Apple indefinitely delayed the rollout of CSAM scanning, stating that it wanted to make improvements based on "feedback from customers, advocacy groups, researchers and others."

The Electronic Frontier Foundation applauded the postponement but said that nothing short of abandoning the project entirely would be enough. Until recently, however, the support pages still contained a full explanation of how the system would work.

On Wednesday, MacRumors noticed that the CSAM content was missing from the website. Cupertino has yet to comment on the removal. Presumably, Apple has either put the project on the back burner until it figures out how to implement it without raising users' hackles, or has canceled it altogether.


 
“Those who were found to have CSAM would face prosecution on evidence gathered under a blatant violation of their Fourth Amendment rights.”

The proposal had hashes created on the device, but the hashes and images were only accessed in iCloud after the user uploaded their photos. Cloud providers have the right to view and use data such as photos you upload to their services, and they can voluntarily give law enforcement access to the data on their servers without a user's permission. Additionally, the Fourth Amendment covers actions by the government, which this is not. There was some speculation that the US government would not be able to compel Apple to use this capability to search for other content (e.g., a specific picture), because that would violate the Fourth Amendment.

Where is your source? Or did you just make this up to pad out the article?
 
I think that CSAM would have been more appropriately spelled if the C and the S swapped positions, just like everything else that Apple does.
 
Where is your source? Or did you just make this up to pad out the article?
No, I didn't make anything up. I reported on the concerns voiced by privacy advocates when the news broke. And for the sake of argument, once the authorities become involved, it does become a Fourth Amendment issue, unless the police, FBI, or whoever have become exempt from the Constitution when I wasn't looking.

As for scanning in the cloud or on servers, nobody was questioning that (well, some were, but that didn't apply to the on-phone scanning). Apple claimed that the scanning would occur on-device, NOT in the cloud. From its own, now-scrubbed support pages:

"Apple's method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations."

So by its own description, scanning would be done on the phone. The source is Apple itself. (https://web.archive.org/web/20210813092136/https://www.apple.com/child-safety/)
 
I don’t think you have the correct interpretation of a user’s rights. It doesn’t matter when the algorithm generates the hash; performing image analysis is a feature of the operating system, and the user already agreed to Apple modifying its software at will in the terms of service. Apple receives the hashes and the images in the cloud after the user uploads them willingly (iCloud photo sync); it never proposed to upload images without user consent. It then examines the cloud copy of an image only when those images have a high hash match.

Once they are on Apple’s servers, Apple is free to examine a user’s images for any reason, and if it believes the images uploaded to its servers are illegal, it is free to provide those images, logs, and the user’s details to law enforcement, who may use that legally obtained data to investigate and prosecute.

Law enforcement is a government body and is restricted from searching private data without a warrant, but it would not be performing any searches of private data here, so there is no Fourth Amendment restriction.

I think Apple was silly to do this on-device rather than in the cloud like other companies do, but I don’t think your statement that there is a breach of the US Constitution is correct.
 
I don’t think you have the correct interpretation of a user’s rights. It doesn’t matter when the algorithm generates the hash; performing image analysis is a feature of the operating system, and the user already agreed to Apple modifying its software at will in the terms of service. Apple receives the hashes and the images in the cloud after the user uploads them willingly (iCloud photo sync); it never proposed to upload images without user consent. It then examines the cloud copy of an image only when those images have a high hash match.

I think Apple was silly to do this on-device rather than in the cloud like other companies do, but I don’t think your statement that there is a breach of the US Constitution is correct.

The part in bold has already been totally debunked, but even if it were true, you still would've reached a new level of self-contradiction.
 
And the moral of the story is boyz 'n gurlz, if you're that ungodly f**king stupid enough to keep kiddie porn on your phone, buy an Android device.

Hey wait though, I thought the Apple logo was all about defloration and original sin. Just exactly how old were Adam and Eve? Would their photos be flagged? And who would you report them to anyway?
 