What just happened? Apple has announced that it will delay, but not cancel, its plan to scan iPhones and iCloud accounts for child sexual abuse material (CSAM). Since its announcement last month, the feature has drawn concern from privacy advocates and sparked intense debate over whether the scanning could expand beyond CSAM.

Apple confirmed the delay in a statement released to news organizations, including Ars Technica. "Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material [CSAM]," the statement reads.

"Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

Apple hasn't revealed further details, such as how long the delay will last or what changes it plans to make to the scanning feature.

Early in August, Apple revealed that it would start scanning iPhones and iCloud accounts when iOS 15 ships this fall. A database of hashes of known CSAM images, supplied by the National Center for Missing & Exploited Children (NCMEC), would be stored on every device running one of Apple's operating systems. The device would compute a neural-network-based perceptual hash of every image uploaded to iCloud and compare it against that database. If an image matched, Apple would submit it for human review to determine whether it should be reported to NCMEC.
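To make the matching step concrete, here is a minimal sketch in Python. It is not Apple's implementation: the real perceptual hash (NeuralHash), the on-device database format, and the private matching protocol are all proprietary, so a cryptographic hash stands in for the perceptual hash, and the names below are invented for illustration.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in: Apple's system uses a proprietary perceptual hash
# ("NeuralHash") produced by a neural network, so visually similar images
# map to the same hash. SHA-256 is used here only to make the sketch
# runnable; it matches exact bytes, not similar-looking images.
def image_hash(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

# Assumption: the NCMEC-derived database ships to the device as an opaque
# set of hashes. KNOWN_CSAM_HASHES and scan_before_upload are illustrative
# names, not Apple APIs.
KNOWN_CSAM_HASHES: set[str] = set()

def scan_before_upload(images: list[Path]) -> list[Path]:
    """Return the images whose hashes appear in the known database;
    these are the ones the article describes as flagged for review."""
    return [img for img in images if image_hash(img) in KNOWN_CSAM_HASHES]
```

The key design point the article describes is that the database lives on the device and only matches, never the images themselves, are surfaced to Apple for review.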

Many are concerned that governments could pressure Apple to scan for images beyond CSAM. Apple has already said it would reject such demands. At least one researcher has posited a hypothetical in which the US government sidesteps Apple entirely and instead pressures NCMEC to alter the database stored on devices.

The Electronic Frontier Foundation (EFF) argues that however narrowly scoped a backdoor may be, it is still a backdoor. Apple itself has made the same argument when refusing to unlock suspects' iPhones for law enforcement. There are also concerns that scanning people's phones could violate the Fourth Amendment.

Apple's own employees have reportedly debated the issue intensely. Some say it undermines the reputation for user privacy that Apple has worked to build; others see the system as a possible step toward end-to-end encryption for iCloud.