
UK Funds CSAM Detection Tech for Encrypted Messaging

September 8, 2021

U.K. Government Invests in CSAM Detection Technologies for Encrypted Platforms

The United Kingdom’s government is allocating over $584,000 to foster the creation of technologies designed to detect child sexual abuse material (CSAM) within end-to-end encrypted messaging platforms. This initiative forms a key component of the government’s broader strategy concerning internet safety and the protection of children.

Tech Safety Challenge Fund Launched

A collaborative effort between the Home Office and the Department for Digital, Culture, Media and Sport (DCMS) has resulted in the establishment of a “Tech Safety Challenge Fund.”

This fund will distribute up to £425,000 (~$584,000) among five selected organizations, providing each with £85,000 ($117,000) to develop “innovative technology to keep children safe” within online messaging environments utilizing end-to-end encryption.

Maintaining User Privacy is a Priority

The program’s challenge statement emphasizes the importance of developing solutions that can operate effectively within E2E-encrypted environments without compromising user privacy.

A Home Office spokesperson explained that the core issue is that end-to-end encryption impedes law enforcement’s ability to access message content. They argued that widespread adoption of full end-to-end encryption would significantly hinder efforts to protect children online.

Concerns Regarding Encryption and Law Enforcement

Home Secretary Priti Patel has previously voiced concerns about the expansion of end-to-end encryption by platforms like Facebook, warning that it could obstruct investigations into child abuse crimes.

WhatsApp, which already employs E2E encryption, is also considered a primary target for the technologies resulting from this government-funded challenge.

Other mainstream Apple services, including iMessage and FaceTime, also utilize E2E encryption, suggesting a potentially broad application for any successful “child safety tech” developed through this initiative.

Independent Evaluation of Technologies

Technologies submitted for consideration will be assessed by independent academic experts. However, the Home Office has not yet disclosed the identities of these evaluators.

Pressure on Tech Sector and G7 Collaboration

Patel is actively applying pressure on the technology sector, seeking support from G7 counterparts to collectively urge social media companies to enhance their efforts in addressing harmful content.

In an opinion piece, Patel stated that the introduction of end-to-end encryption must not facilitate increased child sexual abuse. She refuted claims that the government’s intentions are related to surveillance of citizens, emphasizing the goal of protecting vulnerable individuals and preventing heinous crimes.

Apple’s CSAM Detection Tool

Patel highlighted Apple’s recently announced CSAM detection tool for iOS and macOS as a positive “first step.”

Apple claims this technology has a false positive rate of less than one in one trillion per year for incorrectly flagging a given account, safeguarding user privacy while identifying accounts that store collections of known child sexual abuse material.
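
For a rough sense of how a reporting threshold can drive the account-level error rate that low, the sketch below models each stored photo as an independent trial with a tiny per-image false match probability and computes the binomial tail probability of an innocent account reaching the threshold. The numbers used (10,000 photos, a one-in-a-million per-image rate, a threshold of 30 matches) are hypothetical and purely illustrative; they are not Apple’s published figures, and Apple’s actual system (NeuralHash with threshold secret sharing) works differently.

    # Illustrative sketch only -- hypothetical numbers, not Apple's actual
    # parameters or algorithm. Models each stored photo as an independent
    # trial with a small per-image false match probability and asks how
    # likely an innocent account is to reach the reporting threshold.
    from math import comb

    def account_false_positive_prob(n_images, per_image_fp, threshold):
        """Binomial tail: P(at least `threshold` false matches in `n_images`)."""
        return sum(
            comb(n_images, k) * per_image_fp**k * (1 - per_image_fp)**(n_images - k)
            for k in range(threshold, n_images + 1)
        )

    # With a one-in-a-million per-image false match rate and a threshold of
    # 30 matches, even a 10,000-photo library is astronomically unlikely to
    # be flagged by chance.
    print(account_false_positive_prob(10_000, 1e-6, 30))

The point of the threshold design is that per-image matching errors, however rare, compound far more slowly than the threshold suppresses them, so an account is only surfaced once multiple independent matches accumulate.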

Concerns About Scanning Infrastructure

However, Apple delayed implementing the CSAM detection system following criticism from security experts and privacy advocates. Concerns were raised about vulnerabilities in the approach and the potential for governments to expand the scope of scanning to include content beyond CSAM.

Patel’s characterization of Apple’s move as merely a “first step” has fueled anxieties that once scanning infrastructure is integrated into E2E encrypted systems, it could become susceptible to governmental overreach.

Government Aims for “Middleground” Solutions

The Home Office spokesperson clarified that Patel’s comments regarding Apple’s CSAM tech were intended to acknowledge the company’s efforts in child safety, not necessarily an endorsement of the specific technology.

The government is seeking a range of solutions and is not attempting to encourage the creation of backdoors into E2E encryption, but rather “middleground” solutions.

Past Proposals and Current Guidance

In the past, GCHQ proposed a “ghost protocol” allowing intelligence agencies to be invisibly copied on encrypted communications. This proposal faced widespread criticism.

Recent guidance from DCMS recommended that messaging platforms “prevent” the use of E2E encryption for child accounts.

The Home Office spokesperson indicated that the tech fund aims to find a balance between platform objectives and child protection.

Challenge Fund Details and Application Process

The Challenge is open to applicants globally, and further information is available on the Safety Tech Network website.

The application deadline is October 6th, with selected applicants having five months (November 2021 – March 2022) to complete their projects.

Government’s Broader Regulatory Efforts

This Challenge is part of a larger U.K. government effort to align platforms with its policy priorities. A draft Online Safety Bill is currently undergoing parliamentary scrutiny.

The government’s data protection watchdog is also enforcing a children’s design code, requiring platforms to prioritize children’s privacy by default.

These initiatives suggest a growing confidence within the government that it is developing a successful blueprint for regulating technology giants.

The sustained focus on child safety is proving to be an effective political tool for achieving compliance from platforms.

#CSAM #child sexual abuse material #encryption #e2e encryption #UK government #tech funding