Apple says it's already scouring your emails for child abuse material

nanoguy

Staff member
Why it matters: There's an epidemic of child sexual abuse material on online platforms, and while companies like Facebook have responded by flagging it wherever it pops up, Apple has been quietly developing a set of tools to scan for such content. Now that its approach has come into focus, it has become a major source of controversy.

Earlier this month, Apple revealed that it plans to start scanning iPhones and iCloud accounts in the US for content that can be described as child sexual abuse material (CSAM). Although the company insisted the feature is meant only to help criminal investigations and won't be expanded beyond its original scope regardless of government pressure, the announcement left many Apple fans confused and disappointed.

Apple has marketed its products and services in a way that created the perception that privacy is a core focus and high on the list of priorities whenever it considers a new feature. With the AI-based CSAM detection tool the company developed for iOS 15, macOS Monterey, and its iCloud service, it achieved the exact opposite, sparking a significant amount of internal and external debate.

Despite a few attempts to clear up the confusion around the new feature, the company's explanations have only raised more questions about how exactly it works. Today, Apple dropped another bomb when it told 9to5Mac that it already scours iCloud Mail for CSAM, and has been doing so for the past three years. iCloud Photos and iCloud backups, on the other hand, haven't been scanned.

This may explain why Eric Friedman, who presides over Apple's anti-fraud department, said in an iMessage thread (revealed in the Epic vs. Apple trial) that "we are the greatest platform for distributing child porn." Friedman also noted that Apple's obsession with privacy has made its ecosystem the go-to place for people looking to distribute illegal content, as opposed to Facebook, where extensive data collection makes it much easier to uncover nefarious activity.

It turns out that Apple has been using "image matching technology to help find and report child exploitation" largely under the radar for the last few years, mentioning it only briefly at a tech conference in 2020. Meanwhile, Facebook flags and removes tens of millions of images of child abuse every year and is very transparent about doing so.
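Systems like this generally rely on perceptual hashing rather than on anyone viewing the images: an attachment or upload is reduced to a compact fingerprint and compared against a database of fingerprints derived from previously identified material. The sketch below is a minimal illustration of that general idea using a simple average hash and Hamming-distance comparison in Python; the function names and the known-hash set are hypothetical, and this is not Apple's proprietary matching system (whose internals it has not published), only a rough stand-in for the technique.

```python
# Toy illustration of perceptual-hash matching. NOT Apple's system; the names
# and the known-hash set below are hypothetical stand-ins for a curated
# database maintained by child-safety organizations.
from PIL import Image  # pip install Pillow

HASH_SIZE = 8  # 8x8 grid -> 64-bit hash

def average_hash(path: str) -> int:
    """Shrink to 8x8, grayscale, then set a bit for each pixel above the mean."""
    img = Image.open(path).convert("L").resize((HASH_SIZE, HASH_SIZE))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count the bits on which two hashes differ."""
    return bin(a ^ b).count("1")

def matches_known_hash(path: str, known_hashes: set[int], threshold: int = 5) -> bool:
    """Flag an image whose hash is within `threshold` bits of any known hash."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)

# Hypothetical usage: a positive match would typically be queued for human
# review rather than acted on automatically.
# if matches_known_hash("attachment.jpg", known_hashes):
#     queue_for_human_review("attachment.jpg")
```

The point of the fingerprint-plus-threshold approach is that the scanner only learns whether an image is close to already-known material; how Apple's own pipeline handles matches, thresholds, or review is not something the company has detailed.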

Apple seems to be operating under the assumption that since other platforms make it hard for people to do nefarious things without getting their accounts disabled, those people would naturally gravitate toward Apple's services to avoid detection. Scanning iCloud Mail for CSAM attachments may have given the company some insight into the kind of content people send through that route, and possibly even contributed to the decision to expand its CSAM detection tools to cover more ground.

Either way, this doesn't make it any easier to understand Apple's motivations, nor does it explain how its CSAM detection tools are supposed to protect user privacy or prevent governmental misuse.


 
"we're apple. You're concerned about privacy? Buy us! Oh, also, that whole invading your privacy for muh children? Yeah, turns out we've been doing that for months already and never told you! Look at our new camera lense, now pay us $1500!"

This won't affect sales of the iPhone at all, guarantee you.
 
I don't understand why this is even legal!!!

I mean, the FBI can't walk into my house and start going through things looking for illegal stuff, so why is it that Apple can do that???

And isn't Apple supposed to be a HUGE supporter of user privacy? Gee, maybe they really aren't!
 
Do they realize this could be very bad?
They do not have a warrant to search people's email or phones. Granted, law enforcement may like the ability to identify possible criminals, and in some countries and states they may very well get away with using this evidence.
In the US there is a thing called "fruit of the poisonous tree" when it comes to evidence presented against defendants in the court system. Essentially, this is an illegal search and seizure of digital content from a private system where people have a reasonable expectation of privacy, carried out by a corporate entity rather than a legal entity bound by the civil systems created by law and bylaws. Without a warrant or suspicion of a crime being committed, this is an illegal search and seizure.
That means the evidence itself cannot be used against criminals. Any further evidence brought forth because of a notification of this evidence is also inadmissible by law, unless it would have been discovered naturally and in good faith. Apple does not qualify under the independent source exception, because it clearly states that it is a secure, private platform, which creates a reasonable expectation of privacy, thus breaking the law.
While I fully understand the best of intentions, the greatest atrocities and violations of civil liberties have taken place under that very pretext.
Apple is playing with fire, and on top of that it will probably identify criminals only to set them free, because it does not understand criminal law.
Also, EULAs are civil contracts, and a civil contract can never bypass criminal law. In other words, a civil agreement does not excuse a violation of the civil rights afforded by constitutional law or the Bill of Rights. While it's true that civil contracts can bypass state laws, that can only be done to the degree the state allows, meaning the states pick and choose. Federally, prisoners convicted of a crime, people working for a governmental institution or as contractors for one, and the military are a few examples of scenarios where rights are forfeited.
Apple is stupid.
 
I wonder if there are already SWAT-type services where you pay a darknet hacking group a few bucks and they email a bunch of illegal images to the spouse you're divorcing, the work colleague you're competing against for a promotion, or just that guy in your CS:GO game you don't like.
 
I don't understand why this is even legal!!!

I mean, the FBI can't walk into my house and start going through things looking for illegal stuff, so why is it that Apple can do that???

And isn't Apple supposed to be a HUGE supporter of user privacy? Gee, maybe they really aren't!

You signed the EULA, so the worst you can do is file a class-action.

See also:

https://en.wikipedia.org/wiki/HumancentiPad

Unless you're deprogrammed, The Ghost of Steve Jobs is just going to convince you to go down darker and darker paths. All in the name of "GREATEST USER SAFETY EVAR (they pinkie-swear!)".

I'm feeling double-plus good today about the whole thing (...don't talk here, the phones have eyes!!)
 
I don't understand why this is even legal!!!

I mean, the FBI can't walk into my house and start going through things looking for illegal stuff, so why is it that Apple can do that???

And isn't Apple supposed to be a HUGE supporter of user privacy? Gee, maybe they really aren't!
Well, in the US a certain political party, big tech, and the alphabet agencies all got into bed together. And following the rules, much less adhering to the constitution, means very little.

Spying on citizens under the guise of a righteous cause.
 
I knew it, I knew it. I had my doubts about iPhone users, but Apple has confirmed it.
Or just maybe, maybe it's a marketing stunt.

Hey Joe - you changed from an iPhone to Android - why's that, Joe?

So, in summary: iPhone users make up most of the child molesters (at least in the USA).
Apple will spy on you.
You will need to be creative to hide it.
If you change to another phone company, everyone will know what you are.
 
LOL - Apple pretending they're some kind of moral shining icon for God. Gasp, they care about children. OMG, the avatar of Apple hath arriveth in the sacred grove, suckle its feet and be blesseth.

Excuse me while I exercise my vomit bag.
 
Reminder: never keep any kind of sensitive data in the cloud, on an online service, or on a device that is connected to the internet.
 
People expect the same company that has to put up suicide nets at its manufacturing plants to also protect their privacy...
 
They should change their motto from "privacy is a fundamental human right" to "privacy is a fundamental human right, except for people who abuse children or other people yet to be disclosed."
 
Apple says "privacy is a fundamental human right." Hey I am not English speaker. Maybe I am not understand what fundamental is meaning.
 
I don't understand why this is even legal!!!

I mean, the FBI can't walk into my house and start going through things looking for illegal stuff, so why is it that Apple can do that???

And isn't Apple supposed to be a HUGE supporter of user privacy? Gee, maybe they really aren't!

There's a big difference between the government doing it and a private business whose terms of service you agreed to.
 
You signed the EULA, so the worst you can do is file a class-action.

See also:

https://en.wikipedia.org/wiki/HumancentiPad

Unless you're deprogrammed, The Ghost of Steve Jobs is just going to convince you to go down darker and darker paths. All in the name of "GREATEST USER SAFETY EVAR (they pinkie-swear!)".

I'm feeling double-plus good today about the whole thing (...don't talk here, the phones have eyes!!)
You signed nothing that is legally binding. Also, a EULA is civil, not criminal, so a civil contract cannot bypass criminal law.
 