Apple Delays CSAM Detection in iOS 15 - Privacy Concerns

September 3, 2021
Apple Pauses Rollout of CSAM Detection Technology

Apple has postponed the implementation of its technology designed to detect child sexual abuse material (CSAM). This decision follows a turbulent announcement made last month and is a direct response to concerns raised by both users and advocacy organizations.

The response to the proposed technology has been overwhelmingly critical. The Electronic Frontier Foundation reported gathering over 25,000 signatures in opposition. Furthermore, nearly 100 policy and rights groups, including the American Civil Liberties Union, have urged Apple to reconsider its plans.

Apple's Explanation

In a statement released to TechCrunch on Friday, Apple addressed the situation. The company’s NeuralHash technology aims to identify known CSAM on a user’s device without directly accessing or analyzing the image content itself.

Apple maintains that NeuralHash offers a more privacy-conscious approach: photos stored in iCloud are encrypted, though not with end-to-end encryption, and the system checks a user's device only against known CSAM, in contrast with the broader server-side scanning practices employed by other cloud service providers.
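The general idea of matching images against a database of known fingerprints can be sketched as follows. This is purely illustrative: NeuralHash is a proprietary perceptual hash, so the sketch substitutes a cryptographic hash and hypothetical data, and the function and variable names are inventions for this example.

```python
import hashlib

# Hypothetical database of fingerprints of known material (stand-in values).
# A real perceptual hash tolerates small edits to an image; the SHA-256
# stand-in used here does not, and is chosen only to keep the sketch runnable.
known_hashes = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint for an image (stand-in for a perceptual hash)."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_material(image_bytes: bytes) -> bool:
    """Compare the fingerprint to the database; the content itself is not inspected."""
    return fingerprint(image_bytes) in known_hashes

print(matches_known_material(b"known-bad-image-bytes"))  # True
print(matches_known_material(b"an-ordinary-photo"))      # False
```

The privacy argument rests on the match being made against fingerprints rather than the images themselves; the criticisms below concern what happens when that fingerprint database or the matching behavior can be manipulated.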

Concerns Regarding Potential Abuse

However, security professionals and privacy advocates have voiced significant concerns. They suggest the system could be exploited by powerful entities, such as governments, to falsely implicate individuals or to broaden the scope of detection to include materials deemed objectionable by authoritarian regimes.

Shortly after the technology’s unveiling, researchers demonstrated “hash collisions”: they crafted visually distinct images that produce the same NeuralHash fingerprint, meaning the system could be manipulated into flagging an innocent image as a match for known material.
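The collision concept can be demonstrated with a toy example. This sketch deliberately truncates a hash to 16 bits so that collisions are trivial to find by brute force; real perceptual hashes are much larger, but because they are designed to map similar-looking images to the same value (unlike cryptographic hashes), researchers were able to construct colliding inputs for NeuralHash itself.

```python
import hashlib

def tiny_hash(data: bytes) -> str:
    # Truncated to 4 hex chars (16 bits) purely so a collision appears
    # quickly; this is an illustration of the pigeonhole effect, not
    # a reproduction of the NeuralHash attack.
    return hashlib.sha256(data).hexdigest()[:4]

seen: dict[str, bytes] = {}
collision = None
for i in range(100_000):
    data = f"image-{i}".encode()
    h = tiny_hash(data)
    if h in seen:
        collision = (seen[h], data)  # two distinct inputs, same fingerprint
        break
    seen[h] = data

a, b = collision
print(a != b, tiny_hash(a) == tiny_hash(b))  # True True
```

With only 65,536 possible fingerprints, two different inputs are quickly forced to share one; the NeuralHash research showed the analogous effect could be induced on purpose with engineered images.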

Looking Ahead

The release of iOS 15 is still anticipated within the coming weeks.

Note: This article has been updated to provide further details regarding NeuralHash and to clarify the encryption status of iCloud Photos.

#apple #ios 15 #csam #child sexual abuse material #privacy #delay