Apple's plan to scan devices for child abuse imagery sparks heated internal debate

Cal Jeffrey

Staff member
Why it matters: Last week, Apple previewed child-protection features for iOS 15 that have stirred up controversy among privacy advocates. The OS will scan users' messages and camera roll for images depicting child abuse. In some instances, flagged photos will be reported to the National Center for Missing and Exploited Children (NCMEC), potentially leading to legal action or arrest.
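According to Apple's published overview, the camera-roll check does not "look at" photos in the ordinary sense: it matches image hashes on-device against a database of known-CSAM hashes supplied by NCMEC, and only flags an account after a threshold number of matches. A minimal illustrative sketch of that threshold-matching idea (all names here are hypothetical; the real system uses a perceptual "NeuralHash" and cryptographic private set intersection, not plain cryptographic hashes):

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Hypothetical stand-in for Apple's NeuralHash: a plain SHA-256 digest.
    # The real system uses a perceptual hash so near-duplicate images still match.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of hashes of known abuse imagery (supplied by NCMEC).
KNOWN_HASHES = {image_hash(b"known-bad-example")}

# Apple has said roughly 30 matches are required before an account is flagged;
# the exact number here is illustrative.
MATCH_THRESHOLD = 30

def scan_library(images: list[bytes]) -> bool:
    """Return True only if enough images match the known-hash database
    to cross the reporting threshold."""
    matches = sum(1 for img in images if image_hash(img) in KNOWN_HASHES)
    return matches >= MATCH_THRESHOLD
```

The threshold is the key design point: a single false-positive hash collision cannot trigger a report on its own, which is part of Apple's argument that the system is narrowly scoped.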

Update: Apple responded to the controversy in an interview with the Wall Street Journal this morning. Apple software chief Craig Federighi defended the feature but admitted that Apple flubbed the messaging during the rollout.

“It’s really clear a lot of messages got jumbled pretty badly in terms of how things were understood,” explained Federighi. “We wish that this would’ve come out a little more clearly for everyone because we feel very positive and strongly about what we’re doing.”

Aside from the Fourth and Fifth Amendment implications, some argue these features set Apple down a slippery slope: governments could pressure it to look for other materials they deem illegal. The controversy has grown to the point that Apple's own rank and file are beginning to speak out against the practice.

Speaking on the condition of anonymity, several Apple staffers told Reuters that the image-scanning plan has sparked an 800-comment internal Slack thread, with most commenters expressing concern that this type of intrusion could be abused. The insiders said there had been similar internal debates over company policies or software features in the past, but none as heated as this one.

Some in the thread also expressed that this move damages the company's reputation on user privacy and security. Apple has always been quick to tout its stance on user privacy, but as the Electronic Frontier Foundation (EFF) put it, "Even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor." This precise argument is why Apple butted heads with the FBI when requested to unlock suspects' iPhones in two different shootings.

Some senior and core security employees commented in the thread that they felt the measures were "a reasonable response to pressure to crack down on illegal material." Others added that they hoped the move was a step toward implementing full end-to-end encryption on iCloud. Management also issued an internal memo encouraging employees, saying that the NCMEC was proud of their effort and dismissing the protests as screeching from a minority.

Outside the company, the EFF and the Center for Democracy and Technology (CDT) have both issued objections to Apple's plan in just the last 24 hours. Another coalition of privacy advocates is also finalizing a letter of protest.

"What Apple is showing with their announcement last week is that there are technical weaknesses that they are willing to build in," CDT project director Emma Llanso told Reuters. "It seems so out of step from everything that they had previously been saying and doing."

Apple declined to comment on the internal dustup but did say that it would refuse any government's request to scan for anything other than child abuse material. However, once the system is in place, any country's lawmakers could demand the checking be expanded. While Apple can deny such requests, an embargo from a large market like China could make the demands hard to refuse.

UK tech lawyer Neil Brown sums up the situation: "If Apple demonstrates that, even in just one market, it can carry out on-device content filtering, I would expect regulators/lawmakers to consider it appropriate to demand its use in their own markets, and potentially for an expanded scope of things."

Gain of Function Research Explained:

It is the weaponizing of a virus and spreading it worldwide, to infect the masses and cause them to create a dependency on the very people who created the weapon to protect those masses from that very same weapon.

In order to achieve success, the weapons research team must first get immunity from prosecution so they can continue to harm the subjects while raking in massive profits regardless of the damage they cause, all the while pretending to be the solution to the very same problem they created.

Now, if Microsoft can weaponize the most destructive virus known to man (Windows 11), then why can't Apple do the same?
 
Even in this article we see the transition from "child abuse" to the much more general "pressure to crack down on illegal material". (And even "child abuse" can be expanded to cover a pretty wide range of behavior - it starts as a kidnapped child and can end up as a child exposed to unapproved anti-state messages by their parent.)

Apple should not open this door, and our government should not let them if they tried. If we don't find some balance, future generations may never have a single private, unmonitored discussion, or even thought.

 
Karl Marx said "... the theory of the Communists may be summed up in the single sentence: Abolition of private property."

In other words, your property is no longer private. You do not have property. Not even your body belongs to you. Congratulations, you are now property of the state. Welcome to the Leftist Borg, where everyone is equal and has the same rights: no rights.
 

Karl Marx said "... the theory of the Communists may be summed up in the single sentence: Abolition of private property."

In other words, your property is no longer private. You do not have property. Not even your body belongs to you. Congratulations, you are now property of the state. Welcome to the Leftist Borg, where everyone is equal and has the same rights: no rights.

Like Klaus Schwab said, we'll 'own nothing and be happy'.
 
Funny how years ago anyone talking about OS backdoors was seen as a conspiracy theorist; now we've come to the point where companies are publicly admitting to putting in backdoors and defending the practice, claiming it's for your own good. It's a slippery slope. Today it's Apple, tomorrow it will be Windows.
 
Funny how years ago anyone talking about OS backdoors was seen as a conspiracy theorist; now we've come to the point where companies are publicly admitting to putting in backdoors and defending the practice, claiming it's for your own good. It's a slippery slope. Today it's Apple, tomorrow it will be Windows.
Windows had backdoors in every version since XP

Wake the Bleep up!
 
They can forget the markets of Thailand, Brazil, Mexico, India, Niger, Bangladesh, Chad, Mali, Burkina Faso, Mozambique, Russia for sure, and a lot of others maybe, around 2+ billion people minimum. After all, nobody died from sex; maybe they are the first.
 
When Apple's own people know there is a problem, the board of directors should be given an "oops" upside the head along with a swift boot in the backside.

Tim Cook needs to go along with any of the board of directors that supported this initiative.

It would seem personal rights are under attack from everyone these days. Remember people, if we do not stand strong and protect our rights, we lose them.
 