
Apple iCloud Photo Scanning for Child Abuse Images Confirmed

August 5, 2021

Apple's New CSAM Detection Technology

Apple is preparing to introduce a new system later this year designed to identify and report known child sexual abuse material (CSAM) to law enforcement agencies. The company emphasizes that this technology is built with user privacy as a central consideration.

Protecting Children Online

According to statements made to TechCrunch, the detection of CSAM is a key component of a broader set of features intended to enhance the safety of children utilizing Apple’s services. These features include filters to prevent the transmission of potentially explicit photos via iMessage. Furthermore, the system will proactively respond when a user searches for CSAM-related content through Siri and Search.

A Departure from Traditional Cloud Scanning

While many cloud service providers – including Dropbox, Google, and Microsoft – routinely scan user files for prohibited or illegal content, Apple has historically avoided this practice. The company has consistently offered users the option to encrypt their data before it is uploaded to iCloud, preventing Apple from accessing it.

Introducing NeuralHash

Apple’s new approach, utilizing a technology called NeuralHash, operates directly on the user’s device. It can determine if a user is uploading known child abuse imagery to iCloud without decrypting the images until specific criteria are met and a series of verification checks are successfully completed.

Initial Disclosure and Reactions

The existence of this technology was initially revealed by Matthew Green, a cryptography professor at Johns Hopkins University, via a series of tweets on Wednesday. The announcement prompted varied reactions, with some security experts and privacy advocates expressing concerns, while others acknowledged Apple’s longstanding commitment to security and privacy.

Privacy-Focused Design

Apple is addressing these concerns by incorporating multiple layers of encryption into the system, designed so that several steps must be completed before any data reaches Apple for final manual review.

How NeuralHash Functions

NeuralHash, which will be integrated into iOS 15 and macOS Monterey in the coming months, works by converting images on a user's device into a unique alphanumeric string, known as a hash. With an ordinary cryptographic hash, even a slight modification to an image would produce a completely different hash and defeat matching; Apple says NeuralHash is instead designed to generate the same hash for identical and visually similar images, such as those that have been cropped or edited.
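
NeuralHash's internals have not been fully published, but it belongs to the family of perceptual hashes. As a rough illustration of the general idea, and not Apple's actual algorithm, the sketch below computes a simple "average hash": it shrinks an image to an 8×8 grayscale grid and records which pixels are brighter than the mean, so light edits tend to leave the fingerprint unchanged while unrelated images diverge. The file names and parameters are hypothetical.

```python
from PIL import Image  # pip install Pillow

def average_hash(path: str, grid: int = 8) -> int:
    """Toy perceptual hash: shrink, grayscale, threshold at the mean.

    This is NOT NeuralHash (which uses a neural network); it only shows
    why visually similar images can share a hash, whereas a cryptographic
    hash changes completely if a single bit of the file changes.
    """
    img = Image.open(path).convert("L").resize((grid, grid))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits  # a 64-bit fingerprint for an 8x8 grid

def hamming_distance(a: int, b: int) -> int:
    """Count differing bits; a small distance suggests similar images."""
    return bin(a ^ b).count("1")

# A photo and a lightly cropped copy of it should land within a few bits
# of each other; unrelated photos typically differ in dozens of bits.
# h1 = average_hash("photo.jpg")
# h2 = average_hash("photo_cropped.jpg")
# print(hamming_distance(h1, h2))
```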

Hash Matching and Private Set Intersection

Prior to uploading images to iCloud Photos, these hashes are compared against a database of known CSAM hashes provided by organizations like the National Center for Missing & Exploited Children (NCMEC). NeuralHash employs a cryptographic technique called private set intersection, enabling it to detect a hash match without revealing the image’s content or alerting the user.
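
Apple describes its exact protocol in its technical documentation; as a simplified stand-in, the sketch below implements a classic Diffie-Hellman-style private set intersection. Each side hashes its items into a group and blinds them with a secret exponent; because exponentiation commutes, doubly blinded values collide exactly when the underlying items match, so matches can be detected without either side seeing the other's raw set. The modulus, the item values, and the fact that both sides learn the intersection size are illustrative simplifications, not Apple's design.

```python
import hashlib
import secrets

P = 2**255 - 19  # a large prime modulus (the Curve25519 field prime)

def hash_to_group(item: bytes) -> int:
    """Hash an item to an element of the multiplicative group mod P."""
    return int.from_bytes(hashlib.sha256(item).digest(), "big") % P

def blind(element: int, secret: int) -> int:
    """Blind a group element with a secret exponent."""
    return pow(element, secret, P)

# Each party keeps its exponent private.
server_secret = secrets.randbelow(P - 2) + 1  # e.g. the hash-list holder
client_secret = secrets.randbelow(P - 2) + 1  # e.g. the device

server_items = [b"known-hash-1", b"known-hash-2"]        # hypothetical values
client_items = [b"vacation-photo-hash", b"known-hash-2"]

# Round 1: each side blinds its own hashed items and sends them across.
# Round 2: each side applies its exponent to the other's blinded items.
# Since (H(x)^a)^b == (H(x)^b)^a, only genuine matches collide.
server_double = {blind(blind(hash_to_group(x), server_secret), client_secret)
                 for x in server_items}
client_double = {blind(blind(hash_to_group(x), client_secret), server_secret)
                 for x in client_items}

print(len(server_double & client_double), "item(s) in the intersection")  # 1
```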

Decryption and Reporting Process

The matching results are uploaded to Apple in encrypted form and cannot be read immediately. Apple then relies on threshold secret sharing, a cryptographic technique, to decrypt the contents only if a user's iCloud Photos account crosses a threshold of images flagged as potential CSAM. The specific threshold remains undisclosed, but Apple explained it by analogy: if a secret is split into a thousand pieces and the threshold is ten, any ten pieces are enough to reconstruct it.
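
The standard construction for this is Shamir's secret sharing, in which the secret becomes the constant term of a random polynomial over a prime field: any `threshold` evaluation points determine the polynomial, while fewer reveal nothing. The sketch below is a minimal illustration under assumed parameters, not Apple's implementation; the 10-of-1,000 split mirrors the example above.

```python
import secrets

PRIME = 2**127 - 1  # a Mersenne prime large enough for a 128-bit secret

def split(secret: int, threshold: int, shares: int):
    """Split `secret` into `shares` points on a random polynomial whose
    constant term is the secret; any `threshold` points reconstruct it."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
    points = []
    for x in range(1, shares + 1):
        y = 0
        for c in reversed(coeffs):  # evaluate the polynomial (Horner's rule)
            y = (y * x + c) % PRIME
        points.append((x, y))
    return points

def reconstruct(points):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

# Mirror the article's example: 1,000 shares, hypothetical threshold of 10.
key = secrets.randbelow(PRIME)
shares = split(key, threshold=10, shares=1000)
assert reconstruct(shares[:10]) == key      # any 10 shares suffice
assert reconstruct(shares[500:510]) == key  # which 10 doesn't matter
```

On this analogy, each flagged image would contribute one share, so the flagged content becomes decryptable only once an account accumulates enough matches to cross the threshold.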

Manual Verification and Law Enforcement

Upon reaching the threshold, Apple can decrypt the images, conduct a manual review, disable the user’s account, and report the imagery to NCMEC, which then forwards it to law enforcement. Apple emphasizes that this process is more privacy-respecting than cloud-based scanning, as NeuralHash only searches for known CSAM and not novel content. The company estimates a false positive rate of one in one trillion, with an appeals process available for mistakenly flagged accounts.

Technical Documentation and Support

Apple has published detailed technical documentation regarding NeuralHash on its website, which has been reviewed by cryptography experts and received positive feedback from child protection organizations.

Concerns Regarding Surveillance

Despite broad support for combating child sexual abuse, some individuals express discomfort with the potential for algorithmic surveillance and are advocating for greater public discussion before the technology is widely deployed.

Timing and External Pressures

Apple attributes the timing of this rollout to the recent development of privacy-preserving CSAM detection technology. However, the company has also faced increasing pressure from the U.S. government and its allies to weaken encryption or create backdoors for law enforcement investigations.

Balancing Security and Government Access

While tech companies have refused to build encryption backdoors into their systems, they have faced pushback when trying to shut out government access further. Reuters reported last year that Apple abandoned plans to fully encrypt users' phone backups in iCloud following objections from the FBI.

Potential for Abuse and Mitigation

Concerns have also been raised that malicious actors could abuse the system by sending known CSAM to victims in order to get their accounts falsely flagged. Apple played down those concerns, saying the manual review process would examine flagged accounts for evidence of such misuse.

Initial Rollout and International Considerations

Apple plans to launch NeuralHash in the United States first, with no immediate plans for an international rollout. Such scanning has run into legal trouble elsewhere: Facebook was forced to switch off its CSAM detection tools in the European Union after a rule change inadvertently banned the practice. Apple notes that the feature is technically optional in that users do not have to use iCloud Photos, but scanning will be mandatory for those who do.

Tags: Apple, iCloud, child abuse, CSAM, privacy, scanning