Apple says it will reject government demands to expand new child protection image scanner beyond child abuse images

Daniel Sims

A hot potato: When Apple revealed last week that it will use neuralMatch to scan all iPhone and iCloud accounts for CSAM (child sexual abuse material) when iOS 15 rolls out this fall, privacy advocates warned that it would mark a serious new breach of user privacy. One major concern is that governments could demand Apple add material outside neuralMatch’s original scope, such as political content, to its watchlist.

Apple says it won’t let governments expand the scope of its upcoming child-protection image scanning system, but privacy concerns remain. This past weekend, Apple published an FAQ clarifying how neuralMatch will work. Most importantly, Apple asserts that it will only use the system to detect known CSAM and that it isn’t designed to do anything else.

“Apple will refuse any such demands,” the FAQ reads. “Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups.” Apple stresses that the system won’t automatically notify law enforcement, instead conducting human review of any flagged images.

The FAQ also tries to assuage concerns that non-CSAM material might be injected into neuralMatch in order to flag accounts. “Apple does not add to the set of known CSAM image hashes,” it says. “The same set of hashes is stored in the operating system of every iPhone and iPad user, so targeted attacks against only specific individuals are not possible under our design.”

Some privacy advocates aren’t convinced. Steven Murdoch, a security engineering professor and Royal Society research fellow, pointed to past cases in which similar promises did not hold: “[United Kingdom ISPs] lost in court and did it anyway. Will Apple leave a market if put in the same position?”

Apple’s system will check image hashes against a database of known CSAM hashes derived from images supplied by the National Center for Missing & Exploited Children (NCMEC), a database Apple does not control.
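
For the curious, here is a hypothetical, greatly simplified sketch in Swift of what matching an image against an on-device hash list could look like. The function name, the inputs, and the use of a plain SHA-256 digest are illustrative assumptions only; Apple’s actual design relies on a perceptual “NeuralHash,” a private set intersection protocol, and a match threshold, none of which is reproduced here.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch only: Apple's real system uses a perceptual
// "NeuralHash" plus a private set intersection protocol and a match
// threshold, not a plain cryptographic digest lookup like this.
func matchesKnownHash(imageData: Data, knownHashes: Set<String>) -> Bool {
    // Digest the image bytes (a stand-in for a perceptual hash).
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    // Flag the image only if its digest appears in the on-device hash list.
    return knownHashes.contains(hex)
}
```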

Johns Hopkins University cryptography researcher Matthew Green theorized a scenario in which the United States Department of Justice simply goes over Apple’s head to NCMEC directly, asking it to add non-CSAM material, and NCMEC acquiesces without Apple’s knowledge.

Asked on a call what Apple would do if the Chinese government demanded it scan for non-CSAM content, Apple told Motherboard’s Joseph Cox that the system is not launching in China.


 
This is not about protecting children at all... this is a grab for private data.

First of all, there are plenty of countries where the age of consent is below 18 (well below), and those countries have active users uploading pornography that would be considered illegal in some countries but not in others. There is a steady flow of "content" from those countries, and as long as international laws differ, you can never get rid of the content completely.

Secondly, individual countries can't enforce laws across borders based solely on their own discretion. It's something that must be handled locally, and as I pointed out, many countries have different views and laws on what's legal and illegal.

Third, I don't trust any service (Google, Apple, Microsoft, or anyone else) to browse my devices.
 
What's really revealing is what Apple is basically saying here: they're saying they're above the government, even all governments.

Yes, they're the biggest company in the world, but a concerted effort by several governments across all their key markets could definitely end them in a hurry. And this is exactly the kind of thing that would create a unified front against them.

Of course, I'm not saying they're wrong here: they probably *are* more influential than all governments, simply because of their enormous ability to bribe legislators and officials (or "lobby," in the US and other places that have legalized bribing government officials).
 
Apple wants to help "hide" your info from other companies: https://www.msn.com/en-us/money/com...what-s-in-your-inbox/ar-AAN9bqC?ocid=BingNews

But... only they get to see more of your stuff, by scanning your images and keeping all the data to themselves.

Oh snap.

A conundrum!
Do you go with Apple, which has "your best" interest at heart by helping you hide information from other companies while only allowing itself to go through all your stuff - including images?

Or do you go with other companies that mine your information (location, data use, shopping habits and so on - just like Apple does, but Apple won't share)?
 
I'm ok with this kind of protection for children... But they could extend that to animals too.
Also, they should include a 'false positive' system to prevent acceptable photos/videos from being deleted.
 
> I'm ok with this kind of protection for children... But they could extend that to animals too.
> Also, they should include a 'false positive' system to prevent acceptable photos/videos from being deleted.
Why stop with animals? I think they should also scan all GPS data and report speeding cars. If you step outside your home during a government-mandated curfew, you should be reported. In fact, we can really protect the people: in every state, whatever is illegal, we should be scanning every single phone for every byte of data and reporting it to the proper authorities. This is great technology, and I trust my government to have all of my best interests at heart.
 
In the meantime, Google is laughing. They are collecting photos, videos, and voice recordings from Android devices in their original quality, not bothering with "fingerprinting" or preserving the user's privacy. They don't care whether it's child porn or not; they just collect it and let the AI categorize it before humans inspect the most interesting ones.

Some of that material is shared with the governments of their choice, so that governments are happy with the crumbs. But most of the material Google keeps for its own private use, such as blackmailing politicians when it needs a decision in its favor, or journalists, to make them write nicer articles.
 
> Apple says it will reject demands to expand new child protection image scanner beyond child abuse images

Why now? They're already training on NCMEC's catalog of images, which includes photographs of empty rooms with no people in them.
 
They have already crossed the Rubicon. This all sounds nice, and who can be against it in principle? There will almost certainly be false positives and false negatives. But when you give authoritarians an inch, they take a mile. This, as already mentioned, will lead to even more corruption on the corporate and government side.
 