Apple refuses to scan iCloud photos to detect child abuse

Apple will not scan photos in iCloud to detect child abuse. In an interview with Wall Street Journal columnist Joanna Stern, Apple’s senior vice president of software engineering, Craig Federighi, explained what happened to the company’s plan to scan iCloud photos for CSAM (Child Sexual Abuse Material), that is, material depicting sexual abuse of children. The company has abandoned the idea.

What was the plan?

The program was announced in August 2021. Apple said at the time that it would analyze not the photos themselves but their hashes. To a human, a hash is just a string of characters like ioshdfh9iiklsdh86312jnb23e23091341j123412. The original image cannot be reconstructed from these characters, but they are enough for the matching system to determine whether a photo corresponds to something harmless, like a tree, or to known illegal material.
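Apple’s NeuralHash algorithm is proprietary, but the general idea can be illustrated with a much simpler perceptual hash. The sketch below is a toy average-hash, not Apple’s actual method: visually similar images produce similar hash strings, while the original pixels cannot be recovered from the hash.

```python
def average_hash(pixels: list[int]) -> str:
    """Toy perceptual hash. `pixels` is a flattened grid of grayscale
    values (e.g. a downscaled 8x8 thumbnail, 64 values in 0..255).
    Each bit records whether a pixel is brighter than the average,
    so visually similar images yield similar bit patterns."""
    avg = sum(pixels) / len(pixels)
    bits = ''.join('1' if p > avg else '0' for p in pixels)
    # Pack the bits into a fixed-width hex string (4 bits per hex digit).
    return f'{int(bits, 2):0{len(pixels) // 4}x}'

# Two nearly identical "images" hash to the same value,
# but the hash alone reveals nothing about the original pixels.
img_a = [10] * 32 + [200] * 32   # dark half, bright half
img_b = [12] * 32 + [198] * 32   # slightly altered copy
print(average_hash(img_a) == average_hash(img_b))
```

Real perceptual hashes are far more robust (they survive cropping, recompression, and color shifts), but the principle is the same: the hash is a compact fingerprint of image content, not an encryption of it.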

The system was supposed to match the hashes of a user’s images against hashes provided by the National Center for Missing and Exploited Children (NCMEC). Once a threshold number of matches was reached, indicating high confidence in a true match, Apple would manually review the contents of the associated security vouchers. If the material were confirmed, the iCloud account would be blocked and a report sent to NCMEC, which cooperates with US law enforcement.
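The match-and-threshold step can be sketched roughly as follows. This is a simplified illustration: Apple’s published design used cryptographic safety vouchers and private set intersection so that its servers learned nothing about accounts below the threshold, and the function names and threshold value here are hypothetical.

```python
def count_matches(photo_hashes: list[str], known_hashes: set[str]) -> int:
    """Count how many of a user's photo hashes appear in the database
    of known-CSAM hashes (hypothetical plain-set version; the real
    system hid this comparison behind cryptography)."""
    return sum(1 for h in photo_hashes if h in known_hashes)

def flag_for_review(photo_hashes: list[str], known_hashes: set[str],
                    threshold: int = 30) -> bool:
    """Only once the number of matches crosses the threshold would a
    human review the corresponding security vouchers. The threshold
    value here is illustrative, not Apple's actual figure."""
    return count_matches(photo_hashes, known_hashes) >= threshold
```

The threshold exists precisely because of false positives: a single accidental hash collision should never trigger a human review, let alone a report.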

Criticism

Despite its seemingly good intentions, Apple’s initiative drew sharp criticism: not only would it intrude on user privacy, but the system could also be repurposed for other ends.

One possible scenario was outlined by cryptographer Matthew Green. He emphasized that modern AI produces a large share of false positives. But the situation could turn out far worse: suppose an authoritarian government comes to Apple and offers a choice: leave our market, or give us access to this system. The search would then target not child abuse material but whatever threatens that government, with the subsequent neutralization of iPhone owners.

Others simply could not see why Apple had decided to take on such a burden in the first place.

How events unfolded

As a result, Apple suspended the rollout of the feature. Until December 2021 it still came up occasionally, but this year, amid everything else that happened, it was largely forgotten. Then the unexpected happened: Craig Federighi said that work on the system had been stopped.

Federighi was referring to the iMessage feature announced at the end of 2021, which sends a notification to a child’s parents if the child tries to send a nude photo.

This iMessage feature went live on December 7th. For now it is being tested with a limited number of users in the US, with a worldwide rollout planned for 2023. It was introduced alongside the launch of end-to-end encryption for many parts of the Apple ecosystem, including backups and messages.
