Apple to scan all iPhones and iCloud accounts for child abuse images

midian182

A hot potato: Apple has revealed plans to scan all iPhones and iCloud accounts in the US for child sexual abuse material (CSAM). While the system could benefit criminal investigations and has been praised by child protection groups, there are concerns about the potential security and privacy implications.

The neuralMatch system will scan every image before it is uploaded to iCloud in the US, using an on-device matching process. If it detects what it believes to be illegal imagery, a team of human reviewers will be alerted. Should child abuse material be confirmed, the user's account will be disabled and the US National Center for Missing and Exploited Children notified.

NeuralMatch was trained using 200,000 images from the National Center for Missing & Exploited Children. It will only flag images whose hashes match those in the database, meaning it shouldn't flag innocent material.

"Apple's method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users' devices," reads the company's website. It notes that users can appeal to have their account reinstated if they feel it was mistakenly flagged.

Apple already checks iCloud files against known child abuse imagery, but extending this to local storage has worrying implications. Matthew Green, a cryptography researcher at Johns Hopkins University, warns that the system could be used to scan for other files, such as those that identify government dissidents. "What happens when the Chinese government says: 'Here is a list of files that we want you to scan for,'" he asked. "Does Apple say no? I hope they say no, but their technology won't say no."

Additionally, Apple plans to scan users' encrypted messages for sexually explicit content as a child safety measure. The Messages app will add new tools to warn children and their parents when receiving or sending sexually explicit photos. But Green also said that someone could trick the system into believing an innocuous image is CSAM. "Researchers have been able to do this pretty easily," he said. This could allow a malicious actor to frame someone by sending a seemingly normal image that triggers Apple's system.

"Regardless of what Apple's long term plans are, they've sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users' phones for prohibited content," Green added. "Whether they turn out to be right or wrong on that point hardly matters. This will break the dam — governments will demand it from everyone."

The new features arrive on iOS 15, iPadOS 15, macOS Monterey, and watchOS 8, all of which launch this fall.

Masthead credit: NYC Russ


 
I'll be keeping this in my notes for their next conference, when they talk about being super concerned with user privacy for 10 minutes straight. Just as we suspected, it was BS; always has been.
 
I don't get it. They shouldn't be able to pick and choose what benefits them....

Apple pushes hard against tracking users and using their data (so we're told, but I still don't really believe it's for the consumer).
Apple refuses to unlock phones for government/police bodies for possible terrorist threats.
Apple decides it's okay to view your pictures for possible child abuse.

Talk about hypocrisy.
 
I don't get it. They shouldn't be able to pick and choose what benefits them....

Apple pushes hard against tracking users and using their data (so we're told, but I still don't really believe it's for the consumer).
Apple refuses to unlock phones for government/police bodies for possible terrorist threats.
Apple decides it's okay to view your pictures for possible child abuse.

Talk about hypocrisy.
That's called baiting the sheep. RIP to those preaching about switching to iOS for privacy concerns.
 
"The neuralMatch system will scan every image before it is uploaded to iCloud in the US using an on-device matching process. If it believes illegal imagery is detected, a team of human reviewers will be alerted."

This makes me think there's some kind of legal reason for not just scanning the files on the iCloud servers, possibly along the lines that once it's on the iCloud server, Apple becomes responsible for hosting such content and could be held partially accountable or required to give up access. Whereas if they catch it on the end user's device, they simply divert the photos and have some poor bastards review them before alerting the authorities. If the evidence is ever subpoenaed, Apple doesn't have to grant access to their main iCloud servers this way. Makes you wonder if they ran the scan on their iCloud servers, found an alarming amount of CSAM, and needed a workaround for reporting it to the authorities without compromising total control of their servers.

Or maybe not, I just can't trust Apple.
 
Well, if you want to stay private you can always buy Android. Oh wait, that's even worse at protecting your data.

Silicon Valley has teamed up to make sure no one can compete with their products and services. If a competitor that actually valued privacy showed up, Google and Facebook would conveniently ensure their services don't work as well on that device/service. If a competing social media application starts getting too big, Google and Apple remove it from their app stores, citing some bullshit security issue or something.

If the USA had a competent administration that actually looked out for John Q. Public, it would have created a digital bill of rights long ago. Currently, police need a warrant to search your home, but your phone is fair game; they can look at that whenever they like, for any reason.
 
Unsurprisingly, the "Privacy Advocates" of TechSpot don't much care for catching pedophiles.

Wonder why.
 
"On January 24th, Apple Computer will introduce Macintosh. And you'll see why 1984 won't be like 1984." You are having to wait until 2021 for that to be starting.
 

Imagine being in one of those crazy group chats and someone sends a CP photo.

You erase it on your device, but knowing these devices, it's still stored somewhere; then it gets scanned like this and boom, you're in jail and on a registry.

This will end really well....
 
Unsurprisingly, the "Privacy Advocates" of TechSpot don't much care for catching pedophiles.

Wonder why.
I'm sure everyone here would be OK with catching and punishing pedophiles to the max. What you are missing is that in an ever more fascistic world, opening the door to violations of the constitutional requirement for a warrant before search and seizure is not just a slippery slope, it is a cliff. Start with pedophiles, move on to images of guns, drugs, being in the company of persons of interest, being in public without a mask, etc.
To deny that this is the risk is willful ignorance of history.
 
Unsurprisingly, the "Privacy Advocates" of TechSpot don't much care for catching pedophiles.

Wonder why.
Unsurprisingly, another strawman argument.

We’re discussing the irony and hypocrisy of Apple’s “privacy” policies.

To keep with the topic of conversation, you could however rephrase your statement as:

“Unsurprisingly, the ‘Privacy Advocates’ of Apple don’t much care for assisting with domestic terrorism investigations in their own country.”
 
The fact that people think this is a great idea is just sick. The excuse that it's for child abuse etc. is BS. This is just opening the door to many other things. Another reason why I don't buy or support anything Apple, period.
 
Does that mean Apple will have a back door to your encrypted data so it can scan and send them the content if they deem it necessary?
 
On the one hand, it's further proof that the U.S. is a police state, thanks to the lib-con undying love for the bourgeois state apparatus' repression of PoC, unionists and anti-militarists.

On the other hand, a lot of people who spent the last few years denouncing "paedophiles in the government" will be caught with 12GB of CP in their files, and maybe through false flags they'll also catch some more homophobic "fervent Christians" with large collections of gay porn. And in both cases that will be hilarious.
 
And suddenly a lot of noted conservatives will be having kiddie porn magically appear on their iPhones. Anyone who's paid attention for the last two years knows what this is all about.
Don't forget the one token Democrat who will be thrown under the bus as well, to make it look impartial.
 
Does that mean Apple will have a back door to your encrypted data so it can scan and send them the content if they deem it necessary?
From my read of this:

I think they are only sending a partial fingerprint of the file along with the file when you upload it to iCloud yourself. They use that to identify those images once the threshold number is reached in iCloud storage, and they then decrypt and review those files.

iOS always has full access to your unencrypted files and messages on your device.

Your images matching the CSAM fingerprints are not remotely retrieved from your device.

For the images you choose to store in iCloud, Apple has access to the encryption key used to decrypt them today.

Just my interpretation from a more complete description on another site.
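
If that reading is right, the server-side part amounts to a per-account threshold check. Here's a heavily simplified sketch in Python; the real design reportedly uses threshold secret sharing so no individual voucher can be read until enough matches accumulate, whereas the plain counter and every name below are made up purely for illustration:

```python
MATCH_THRESHOLD = 30  # illustrative value only

class AccountSafetyVouchers:
    """Server-side view of one account's safety vouchers (heavily simplified)."""

    def __init__(self) -> None:
        self.matched_voucher_ids: list[str] = []

    def record_match(self, voucher_id: str) -> None:
        # In the real design the voucher contents stay cryptographically unreadable here;
        # the server can only tell whether enough matching vouchers exist.
        self.matched_voucher_ids.append(voucher_id)

    def ready_for_human_review(self) -> bool:
        return len(self.matched_voucher_ids) >= MATCH_THRESHOLD

# Nothing becomes reviewable until the account crosses the threshold.
account = AccountSafetyVouchers()
for i in range(MATCH_THRESHOLD):
    account.record_match(f"voucher-{i}")
print(account.ready_for_human_review())  # True only once the threshold is reached
```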
 