Why it matters: Last week, Apple previewed a set of child-protection features coming to iOS 15 that has stirred up controversy among privacy advocates. The OS will scan users' messages and camera roll for images depicting child abuse. In some instances, matches will be reported to the National Center for Missing and Exploited Children (NCMEC), potentially leading to legal action or arrest.
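
Apple has not published implementation details, but the detection it describes is broadly a form of hash matching: a fingerprint of each image is compared against a database of fingerprints of already-known abusive material, rather than a human or a model inspecting the photo directly. Below is a minimal sketch of that general idea; the function names, database contents, and use of a cryptographic hash are illustrative assumptions, not Apple's actual design.

```python
import hashlib

# Illustrative database of fingerprints of known images. In Apple's
# system the real list reportedly derives from NCMEC; this entry is a
# placeholder (the SHA-256 of empty input) so the sketch is runnable.
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def image_fingerprint(image_bytes: bytes) -> str:
    # Apple describes a perceptual hash ("NeuralHash") that tolerates
    # resizing and re-encoding; a cryptographic hash stands in here
    # only to keep the example self-contained.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    """Return True when an image's fingerprint is in the database."""
    return image_fingerprint(image_bytes) in KNOWN_BAD_HASHES

print(matches_known_database(b""))   # True: matches the placeholder entry
print(matches_known_database(b"x"))  # False: unknown image
```

Apple's published design layers additional safeguards on top of a raw match, including a match threshold and human review before anything reaches NCMEC.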

Update: Apple responded to the controversy in an interview with the Wall Street Journal this morning. Software chief Craig Federighi defended the feature but admitted that Apple flubbed the messaging during the rollout.

"It's really clear a lot of messages got jumbled pretty badly in terms of how things were understood," explained Federighi. "We wish that this would've come out a little more clearly for everyone because we feel very positive and strongly about what we're doing."

Aside from the Fourth and Fifth Amendment implications, some argue these features set Apple down a slippery slope: governments could pressure the company to scan for other material they deem illegal. The backlash has grown to the point that Apple's own rank and file are beginning to speak out against the practice.

Speaking on condition of anonymity, several Apple staffers told Reuters that the image-scanning plan has sparked an 800-comment internal Slack thread, with many expressing concern that this kind of intrusion could be abused. The insiders said there have been similar internal debates over company policies and software features in the past, but none as heated as this one.

Some in the thread also worried that the move damages the company's reputation on user privacy and security. Apple has always been quick to tout its stance on user privacy, but as the Electronic Frontier Foundation (EFF) put it, "Even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor." That precise argument is why Apple butted heads with the FBI when the agency asked it to unlock suspects' iPhones after two different shootings.

Some senior and core security employees commented in the thread that they felt the measures were "a reasonable response to pressure to crack down on illegal material." Others added that they hoped the move was a step toward bringing full end-to-end encryption to iCloud. Management also circulated an internal memo encouraging employees, saying that the NCMEC was proud of their effort and dismissing the protests as screeching from the minority.

Outside the company, the EFF and the Center for Democracy and Technology (CDT) have both issued objections to Apple's plan within the last 24 hours. A coalition of other privacy advocates is also finalizing a letter of protest.

"What Apple is showing with their announcement last week is that there are technical weaknesses that they are willing to build in," CDT project director Emma Llanso told Reuters. "It seems so out of step from everything that they had previously been saying and doing."

Apple declined to comment on the internal dustup but did say it would refuse any government's request to scan for anything other than child abuse material. However, once the system is in place, lawmakers in any country could demand that the checks be expanded. Apple can deny such requests, but the threat of a sales ban in a large market like China could make them hard to refuse.

UK tech lawyer Neil Brown sums up the situation: "If Apple demonstrates that, even in just one market, it can carry out on-device content filtering, I would expect regulators/lawmakers to consider it appropriate to demand its use in their own markets, and potentially for an expanded scope of things."