Apple feature designed to protect children by scanning images kicks up controversy

Cal Jeffrey

Staff member
A hot potato: During WWDC 2021 in June, Apple unveiled its upcoming device operating systems and made a big deal of the expanded privacy features in iOS, iPadOS, and macOS Monterey. What it didn't elaborate on was its expanded protections for children, and for good reason: at face value, Apple's child-protection features run contrary to its tough stance on user privacy.

In the most recent iOS 15 preview, Apple rolled out features that have many privacy advocates, including the Electronic Frontier Foundation (EFF), crying "backdoor." The features are part of Apple's effort to crack down on Child Sexual Abuse Material (CSAM).

The first feature uses machine learning to look for potentially sensitive images in the Messages app of children under 12. If inappropriate material is received, the picture is blurred, and a notification tells the child it is okay not to view the photo, along with links to "helpful resources." The child is also informed that if they do open the image, their parents will be notified. The feature works in the other direction as well: if the child attempts to send an explicit photo, they receive a warning that sending the image will notify their parents.
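The described flow can be summarized as a simple decision sequence. This is an illustrative sketch only; Apple has not published the on-device model or any API for this feature, so the function name, inputs, and action strings below are all hypothetical, with the classifier's verdict passed in as a flag.

```python
# Hypothetical sketch of the Messages safety flow described above.
# Apple's actual model, APIs, and UI behavior are not public.

def handle_incoming_image(is_sensitive: bool, child_opts_to_view: bool) -> list[str]:
    """Return the sequence of UI actions taken for a child account."""
    if not is_sensitive:
        return ["display image normally"]
    # Sensitive content: blur first, warn, and only notify parents
    # if the child chooses to view the image anyway.
    actions = ["blur image", "show warning and links to helpful resources"]
    if child_opts_to_view:
        actions.append("notify parents")
    return actions
```

The key property the sketch captures is that parental notification is conditional on the child's choice to view, not on mere receipt of the image.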

Apple says that all AI processing is done on the device to protect users' privacy and that nothing is ever uploaded to Apple's servers. The feature will work across all of Apple's device operating systems.

The second feature is called CSAM detection. CSAM refers to content depicting sexually explicit activities involving a child. A database of known images from the National Center for Missing and Exploited Children (NCMEC) is downloaded and stored on the device as hash values. Before a user uploads a photo to iCloud, the AI compares its hash against that database. If an account accumulates enough hits, the content is manually validated and then sent to NCMEC, which handles any legal action.
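The core mechanism, stripped of cryptography, is hash matching with a reporting threshold. Apple's real system uses a perceptual "NeuralHash" combined with private set intersection and threshold secret sharing; the set literals, function names, and threshold value below are invented purely to illustrate the matching-plus-threshold idea.

```python
# Hypothetical, simplified sketch of threshold-based hash matching.
# Real CSAM detection uses perceptual hashes and cryptographic protocols;
# every value and name here is illustrative only.

KNOWN_CSAM_HASHES = {"hash_a", "hash_b", "hash_c"}  # stand-in for the NCMEC database
MATCH_THRESHOLD = 2  # a single match is never enough to trigger review

def count_matches(photo_hashes: list[str]) -> int:
    """Count how many of a user's photo hashes appear in the known database."""
    return sum(h in KNOWN_CSAM_HASHES for h in photo_hashes)

def flag_for_human_review(photo_hashes: list[str]) -> bool:
    """Flag an account for manual validation only once matches cross the threshold."""
    return count_matches(photo_hashes) >= MATCH_THRESHOLD
```

The threshold is the design choice doing the privacy work: one accidental hash collision cannot, by itself, expose an account to human review.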

While nobody would argue against keeping children safer, it is Apple's approach that is raising hackles. The EFF argues that the new features create an opening for countries to pressure Apple into scanning for other content those governments have deemed illegal.


"That's not a slippery slope; that's a fully built system just waiting for external pressure to make the slightest change," the EFF said. "At the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor."

Others hold contrary opinions, saying the move could ultimately be good for security. Tech blogger John Gruber, creator of the Markdown markup language, wrote in Daring Fireball:

"In short, if these features work as described and only as described, there's almost no cause for concern. But the 'if' in 'if these features work as described and only as described' is the rub. That 'if' is the whole ballgame. If you discard alarmism from critics of this initiative who clearly do not understand how the features work, you're still left with completely legitimate concerns from trustworthy experts about how the features could be abused or misused in the future."

Gruber speculates that Apple may see this as an initial step in implementing end-to-end encryption in iCloud.

It also raises Fourth Amendment issues. Does scanning a device, regardless of how securely it is done, violate the Fourth Amendment's protections against warrantless search and seizure? The technology, as implemented, looks something like a backdoor by proxy for law enforcement to search a phone without probable cause.

Apple critics are sure to lambaste the company and its platform for this move, while many fans will take the stance that it's welcome since they have no CSAM on their devices anyway and would like to see their children protected.

Regardless of how you look at it, this is undoubtedly a controversial issue that the community will hotly debate in the weeks leading up to the fall release of Apple's operating systems. Before jumping to one side of the fence or the other, you should read Apple's explanation, along with the several relevant FAQs and technical documents posted to its website.


 

ZedRM

Apple enrages people smart enough to understand what Apple is really doing? What a shocker..

Tim Cook needs to go...

It also raises Fourth Amendment issues. Is scanning a device regardless of how securely it is done violating the Fourth Amendment's protections against warrantless search and seizure?
The short answer is yes. Whether it be by government or private entity, the protections provided by the Fourth Amendment against unlawful search and seizure apply equally to all citizens. These protections are not open for debate, discussion, or negotiation. They are a concrete right, and they are not moot just because Tim Cook gets a wild hair up his bum. This effort using the guise of "child safety" is wildly transparent and pathetic.
 

VitalyT

This is a lot of writing about nothing, desperately trying to make a wave for the sake of doing it, a political smudge.
 

psycros

First off, the initial press release about this did NOT say it was about sexual exploitation - it said CHILD ABUSE. So Apple is already retconning the message to hide their real intent - getting people conditioned to the idea of corporate spyware running on their devices. This has nothing to do with child safety and probably *everything* to do with furthering the vast left-wing conspiracy detailed in Time Magazine.
 

Hexic

Particularly ironic.

As Snowden’s meme suggests - fighting the FBI for the right of a confirmed, murderous domestic terrorist really is a good look when you suggest passively scanning everyone’s data 6 years later under the weak guise of “child safety”.

Ironic how all the fanboys who preached Apple as the Lord & Savior of device privacy are oddly silent…

Have that cake and eat it too, Tim. Along with your boot.
 

kiwigraeme

I think the likes of Google scanning images posted openly on the web is fine.
But I've seen stories of police turning up at someone's door because some zealot at the print/film developing shop saw nude toddler photos. Nearly all kids under 2 or 3 love running around naked, playing in the warm muddy rain, etc. Crikey, some of us would enjoy doing it as adults.
The problem is that once you're tagged on a sex offender list, you may not know and have no recourse.
Just like some people are unlucky enough to end up on intelligence lists. Oh, you just happened to be in Milan in a rental car near a Red Brigades protest. Two weeks later, travelling around, you just happen to be near G8 protests in the Netherlands with the same rental car. Wonder why you always get that red light at the airport: "No, just a random check, sir."
Now in the USA, wonder why you can't attend the school camp as a parent helper, or get that job. With the school you might have a chance of finding out, but it might just be "we have enough parents," even though you are a top mountain biker, outdoors person, chef, or whatever.
 

PEnnn

Just when you think there can't possibly be decent people out there who would protect the rights of pedophiles to commit a predatory crime....then you see this article and the comments.

But hey, they hate Apple, so that's their cover!
 

Beerfloat

Just when you think there can't possibly be decent people out there who would protect the rights of pedophiles to commit a predatory crime....then you see this article and the comments.

But hey, they hate Apple, so that's their cover!

Some concerns seem reasonable to me. As far as Apple is concerned, it might be wisest to stay out of this business altogether.

We're talking about personal devices that are owned by someone. Fourth Amendment concerns were mentioned, and there's also a reason why we have a Fifth Amendment to protect against self-incrimination.

And once Apple demonstrates that they are willing and able to do this much, you can bet pressure to aid in surveillance relating to other crimes will soon follow. I can just see future headlines going 'Apple refuses to aid treason investigation', or what about arms/drugs/human trafficking or whatnot. Slippery slope starts here.
 

m4a4

Just when you think there can't possibly be decent people out there who would protect the rights of pedophiles to commit a predatory crime....then you see this article and the comments.

But hey, they hate Apple, so that's their cover!
That is what we call a false premise.

No one, so far, has stated they're against finding pedophiles. They are against the means (and what doors it will open). And there are very reasonable positions in the comments, so it just seems like you're blindly defending Apple (despite it potentially opening Pandora's box further).

This system could easily be used in secret to find other undesirable pictures in other more restrictive parts of the world...

And a backdoor, as tightly locked as can be, is still a backdoor.
 

mrvco

I'm not looking forward to having my front door breached at 2AM by some overzealous LE SWAT team because of a diaper rash photo I sent to my child's pediatrician to determine whether we needed to come in for an appointment or not. As they say, the path to hell is paved with good intentions.
 

Norsiiii

Just when you think there can't possibly be decent people out there who would protect the rights of pedophiles to commit a predatory crime....then you see this article and the comments.

But hey, they hate Apple, so that's their cover!
I want to search all of your digital devices and print media located within your home to make sure that you're not harboring any illicit child abuse content. I will also go through any and all online cloud storage locations attached to any of your accounts, and your full internet history to make sure it's all above board. You will of course consent to allow me to undertake this search, because after I said the words "to make sure you're not a pedophile" that means that all of your rights magically ceased to exist....

Absolute smooth-brain thinking, pal.
 

negroplasty

Particularly ironic.

As Snowden’s meme suggests - fighting the FBI for the right of a confirmed, murderous domestic terrorist really is a good look when you suggest passively scanning everyone’s data 6 years later under the weak guise of “child safety”.

Ironic how all the fanboys whom preached Apple as the Lord & Savior of device privacy are oddly silent…

Have that cake and eat it too, Tim. Along with your boot.
Apparently some people are though.

Yes, it seems many either misunderstand how this feature works (especially in comparison to storing the same photos on other services), or are worried about their sketchy photo collections. This is clearly a move Apple is making in preparation to go fully E2E encrypted with Photos; quote me on this in a year.

And just in case you didn't get the point from my last reply to you, let me clarify: nobody is defending the non-existent "rights" of pedophiles to commit predatory crimes, you liar, we are defending the rights of EVERYBODY ELSE who are not pedophiles to not have big brother thumbing through their every digital record not only without a warrant but without even the slightest just cause...

It's pretty mind blowing that you fail to understand that

Right... then put your photos on Google or Microsoft's servers and, oh wait, they'll scan all your photos. It's literally no different. ****, Microsoft even looks for copyrighted content.

Let's run through some possible scenarios:

1) You don't have sketchy photos and have nothing to worry about; don't like your photos being scanned, don't use *any* cloud storage service. Turn off iCloud Photos and you're done.

2) You have sketchy photos stored in iCloud. I don't care if you took them or got them some other way; if they're in the database, that means a pedophile was caught with them in their collection. If you took those photos, innocent or otherwise, you're a shitty parent. I doubt anyone is going to get thrown in the slammer for a picture that shows a baby's ***, but if you've got pictures that clearly show a child's genitals, for whatever purpose, you're a dangerous weirdo. Weirdo because you don't need them, and dangerous to assume they'd never leave your custody and be used in a way that could harm your child or someone else's. The world is full of sick and disgusting individuals, and they might even be family members; don't create material for them.

3) You are an actual child predator and it gets you caught. Good, I hope the punishment is castration (if applicable) followed by being put down.

4) The big bad government orders Apple to expand the criteria of what's being searched for. How is this different than if you had your photos in OneDrive, or some other service that scans *all* your photos in the cloud already? It could be done just the same and with far less effort involved to make such a change. See #1.

Prediction: Apple is going to implement e2e encryption for Photos, and this is its way of getting the government off its back about shielding those who are, in actual fact, hurting helpless children, whether it's directly or indirectly.

 

Shaitan

Bwahhahaha. And to think I was seriously considering Apple iPad as a device for home use...
At this point, I'd rather buy a Huawei tablet here in Europe. At least the Chinese are not lying about spying on me.
This is a privacy nightmare for any reasonable person. And the reason why I am trying to learn Linux so as to run my personal things as I intend to, not as Apple or Google or whomever intends me to.
The "think of the children" or "think of the terrorists" angle was explained well enough in the Snowden movie, which I think should be revisited by any of the gullible *****s putting their trust in governments.
 

cliffordcooley

Right... then put your photos on Google or Microsoft's servers and, oh wait, they'll scan all your photos. It's literally no different. ****, Microsoft even looks for copyrighted content.
If that is what is taking place, I have no problem with it. It is, after all, their servers. But if they are scanning everyone's phones, I do have a problem with that.
 

wiyosaya

First off, the initial press release about this did NOT say it was about sexual exploitation - it said CHILD ABUSE. So Apple is already retconning the message to hide their real intent - getting people conditioned to the idea of corporate spyware running on their devices. This has nothing to do with child safety and probably *everything* to do with furthering the vast left-wing conspiracy detailed in Time Magazine.
Watch out. Leftists are known to hide under the beds of right-wingers at night along with all the other right-wing boogeymen conspiracies. :laughing:
 

Uncle Al

Since the Constitution doesn't directly address any form of "right to privacy," and our government isn't about to give up the hooks it already has in the Patriot Act as well as others, there simply isn't much chance of reversing this EXCEPT by the power of the pocketbook. But let's face it, these younger generations simply don't understand what they have given up, and until they figure it out there isn't much chance of anything other than things getting steadily worse. This isn't the blame of any single party; it is the evolution of ignorance when it comes to law.
 

negroplasty

If that is what is taking place I have no problem with it. It is after all their servers. But if they are scanning everyone's phones. I do have a problem with that.

Apple is only scanning on-device photos that are stored in iCloud. Don't store photos in iCloud, which presumably you already weren't doing due to your concerns, so what has changed exactly?

From the FAQ:
CSAM detection in iCloud Photos is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images. CSAM images are illegal to possess in most countries, including the United States. This feature only impacts users who have chosen to use iCloud Photos to store their photos. It does not impact users who have not chosen to use iCloud Photos. There is no impact to any other on-device data. This feature does not apply to Messages.
 

ZedRM

Right... then put your photos on Google or Microsoft's servers and, oh wait, they'll scan all your photos. It's literally no different. ****, Microsoft even looks for copyrighted content.
Those scans are being done on servers owned by them. That's not the problem. The problem is scanning a person's personal device for data. That is what is foul and wrong. It is unethical, immoral, and unlawful on every level. And doing it under the guise of "child safety" is as slimy as you can get. It is pathetically transparent and must NOT be tolerated.
 

Cubi Dorf

I have no problem with giving parents a tool to help keep their children safe. The other part is where the problem lies. You should not be policed by your own device. What's next? Your car reporting you to the police if you drive over the speed limit?
 

BadThad

I have no problem with giving parents a tool to help keep their children safe. The other part is where the problem lies. You should not be policed by your own device. What's next? Your car reporting you to the police if you drive over the speed limit?

Shhhhhhh....don't give democrats any ideas!