
February 25, 2025
UK Cracks Down on Deepfake Pornography

Ofcom’s New Guidance on Online Safety Act

Ofcom, the U.K.'s internet safety regulator, has released further draft guidance as part of its ongoing work to implement the Online Safety Act (OSA).

The newest recommendations are designed to assist companies covered by the Act in fulfilling their legal duties. These duties focus on safeguarding women and girls from online harms, including harassment, bullying, misogyny, and the non-consensual sharing of intimate images.

Government Priorities and Enforcement

The government has explicitly stated that protecting women and girls is a key priority during the OSA’s implementation. Specific types of abusive behavior, particularly those rooted in misogyny, are highlighted within the legislation.

Examples of these prioritized offenses include the distribution of intimate images without permission and the use of AI to generate deepfake pornography targeting individuals.

Criticism and Concerns Regarding the OSA

Despite the substantial penalties for non-compliance – potentially reaching 10% of a company’s global annual revenue – the Online Safety Act has received criticism. Some argue it does not go far enough to meaningfully reform major platforms.

Concerns have also been raised regarding the timeframe for implementation. Child safety advocates have voiced frustration and questioned whether the Act will achieve its intended outcomes.

Acknowledged Imperfections and Implementation Timeline

Even Technology Minister Peter Kyle described the legislation as “very uneven” and “unsatisfactory” in a January interview with the BBC. Nevertheless, the government remains committed to its current approach.

A contributing factor to the discontent surrounding the OSA is the extended period ministers allowed for its implementation. This process necessitates parliamentary approval of Ofcom’s compliance guidance.

Enforcement concerning core requirements related to illegal content and child protection is anticipated to begin shortly. However, full compliance with other aspects of the OSA will require a longer timeframe.

Ofcom acknowledges that this latest set of recommendations will not be fully enforceable until 2027 or beyond.

The Initial Phase of Online Safety Act Enforcement

According to Jessica Smith of Ofcom, who spearheaded the creation of the guidance centered on female safety, the first duties stipulated by the Online Safety Act will come into force next month. She told TechCrunch that enforcement of key duties within the Act will commence even before this guidance itself becomes legally binding.

This newly released draft guidance focuses on ensuring the safety of women and girls online, and is designed to build upon existing, more general Ofcom guidance concerning unlawful content. This broader guidance, for instance, offers suggestions for shielding minors from exposure to adult material on the internet.

Prior Guidance on Illegal Content

In December, the regulator released its final guidance detailing how platforms and services should mitigate risks associated with illegal content. A primary focus within this area is the protection of children.

Previously, a Children’s Safety Code was also issued, advocating for increased age verification and content filtering measures by online services. This aims to prevent children from encountering unsuitable content, such as pornography.

Furthermore, as the online safety framework has been developed, recommendations for age assurance technologies have been created for websites featuring adult content. The goal is to encourage these sites to implement robust measures to prevent underage access to inappropriate material.

Collaboration and Key Areas of Focus

The current guidance was formulated in collaboration with victims, survivors, women’s rights organizations, and safety specialists, Ofcom confirmed. It addresses four primary areas where women and girls are disproportionately affected by online harm:

  • Online misogyny
  • Pile-ons and online harassment
  • Online domestic abuse
  • Intimate image abuse

These areas represent significant concerns that the Online Safety Act seeks to address through proactive regulation and enforcement.

Safety by Design: A Proactive Approach

Ofcom’s primary recommendation centers around a “safety by design” methodology for services and platforms falling within the scope of the regulations. According to Smith, the regulator aims to motivate technology companies to reassess their user experience comprehensively. While acknowledging existing measures implemented by some services to mitigate online risks, she contends that a holistic approach to prioritizing the safety of women and girls remains insufficient.

“We are essentially requesting a significant shift in the design process,” she explained, emphasizing the importance of integrating safety considerations directly into product development.

The proliferation of image-generating AI services was cited as a key example, with Smith noting the substantial increase in deepfake intimate image abuse. She pointed out that technologists could have proactively minimized the potential for their tools to be misused against women and girls, but failed to do so.

“We believe that practical steps can be taken during the design phase to address the risk of such harms,” she stated.

Ofcom’s guidance highlights several examples of commendable industry practices, including:

  • Default removal of geolocation data (to reduce privacy and stalking risks; see the sketch after this list);
  • Implementation of “abusability” testing to identify potential misuse scenarios;
  • Strengthening account security measures;
  • Incorporating user prompts designed to encourage thoughtful content posting;
  • Providing easily accessible reporting mechanisms for users to flag issues.
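
To make the first item above concrete, the sketch below shows how a service might strip GPS metadata from an uploaded photo before publishing it. It uses the Pillow imaging library; the file paths and function name are illustrative, and a real upload pipeline would apply this step by default rather than as an opt-in.

    from PIL import Image

    GPS_IFD_TAG = 0x8825  # EXIF tag that points at the GPS metadata block

    def strip_gps_metadata(src_path: str, dst_path: str) -> None:
        """Re-save an image with its GPS EXIF block removed, keeping other tags."""
        with Image.open(src_path) as img:
            exif = img.getexif()
            if GPS_IFD_TAG in exif:
                # Drop latitude, longitude, and related GPS fields in one step.
                del exif[GPS_IFD_TAG]
            img.save(dst_path, exif=exif)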

As with all of Ofcom’s Online Safety Act (OSA) guidance, not every measure will be relevant to every service, and the services in scope vary widely in type and size. The legislation applies to a broad range of online services, from social media and online dating to gaming, forums, and messaging applications. A crucial task for affected companies will therefore be determining what compliance entails for their specific product.

When questioned about whether Ofcom had identified any services currently adhering to the guidance’s standards, Smith indicated that none had been found. “Significant work remains to be done across the industry,” she affirmed.

She also implicitly acknowledged emerging challenges stemming from recent decisions by certain major industry players regarding trust and safety. She cited the changes Elon Musk made after acquiring Twitter and rebranding it as X, including substantial cuts to trust and safety personnel in favor of a more expansive interpretation of free speech.

Meta, the parent company of Facebook and Instagram, has recently made similar moves, announcing it will end contracts with third-party fact-checkers in favor of a crowdsourced labeling system, akin to X’s “community notes,” to handle content disputes.

Transparency Initiatives by Ofcom

According to Smith, Ofcom’s approach to significant changes in operator behavior – specifically, actions that could potentially exacerbate online harms – will center on leveraging its transparency and information-gathering authorities as outlined in the Online Safety Act (OSA).

The primary objective is to demonstrate the consequences of these actions and enhance user understanding.

Initially, the strategy appears to be one of public accountability, often referred to as “name and shame.”

“Following the finalization of our guidance, a comprehensive report will be published,” Smith explained. “This report will detail which entities are utilizing the guidance, the specific measures they are implementing, and the resulting benefits for female and girl users.”

Ofcom intends to highlight the safety measures in place across various platforms, empowering users to make well-informed decisions regarding their online activity.

Companies seeking to avoid negative publicity due to inadequate safety measures for women will find Ofcom’s guidance offers “practical steps” for improvement.

This proactive approach also addresses the potential for reputational damage.

“All platforms operating within the U.K. are legally obligated to adhere to U.K. legislation,” Smith emphasized, particularly concerning the duties related to illegal harms and the protection of children as stipulated by the Online Safety Act.

“Our transparency powers will be crucial in these situations,” she continued. “Should the industry shift in a direction that leads to increased harms, we will illuminate these trends and disseminate pertinent information to U.K. users, the media, and members of parliament.”

Utilizing Transparency Powers

  • Information Disclosure: Ofcom will share data on platform compliance with safety guidelines.
  • Impact Assessment: Reports will analyze the effects of platform actions on user safety, especially for women and girls.
  • Public Awareness: The goal is to empower users with knowledge to make informed choices.

This approach aims to hold platforms accountable and encourage the prioritization of user safety.

Addressing the Threat of Deepfake Pornography

Ofcom is strengthening its guidance regarding online safety, particularly concerning intimate image abuse, even prior to the full enforcement of the Online Safety Act (OSA). The latest draft guidance proposes the implementation of hash matching technology for the detection and removal of abusive imagery.

This represents a significant evolution from previous recommendations, as earlier Ofcom suggestions did not extend to this level of proactive measure, according to Smith.

“We are including supplementary procedures within this guidance that surpass our existing codes,” Smith stated, confirming Ofcom’s intention to revise its earlier codes to reflect this change “in the coming months.”

Hash Matching for Enhanced Protection

The recommendation to utilize hash matching technology is a direct response to the escalating prevalence of intimate image abuse. This increase is particularly notable in the context of AI-generated deepfake image abuse.

“The number of reported cases involving deepfake intimate image abuse in 2023 exceeded the total from all preceding years combined,” Smith explained, further noting the accumulating evidence supporting the efficacy of hash matching in mitigating this specific harm.
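
As background on the technique Smith refers to, the sketch below illustrates the basic flow of hash matching an upload against a list of known abusive images. The hash list, file paths, and function names here are purely illustrative; production systems typically rely on perceptual hashes such as PDQ or PhotoDNA, which still match re-encoded or lightly edited copies, rather than the exact SHA-256 comparison used in this simplified example.

    import hashlib
    from pathlib import Path

    # Illustrative placeholder: in practice the list would be supplied by a
    # victim-reporting service and would contain perceptual hashes.
    KNOWN_ABUSE_HASHES: set[str] = {
        "0" * 64,  # dummy entry standing in for a real hash value
    }

    def sha256_of_file(path: Path) -> str:
        """Compute the hex SHA-256 digest of a file's raw bytes."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def flag_if_known_abusive(path: Path) -> bool:
        """Return True when an uploaded file matches the known-abuse hash list."""
        return sha256_of_file(path) in KNOWN_ABUSE_HASHES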

Ofcom’s guidance is now subject to a consultation period, accepting feedback until May 23, 2025. Following this, the final guidance is expected to be published by the end of the current year.

Timeline for Implementation and Review

The first report assessing industry practices in this area will be released 18 months after the final guidance, placing the anticipated publication date in 2027.

“The initial report evaluating industry actions [to safeguard women and girls online] is scheduled for 2027, but platforms are not obligated to wait before taking action,” she emphasized.

Responding to concerns about the protracted implementation of the OSA, Smith defended the regulator’s approach, highlighting the importance of thorough consultation on compliance measures.

However, she also indicated that the imminent enactment of the final measures will catalyze a shift in dialogue with platforms.

“This will fundamentally alter the discussions with platforms,” she projected, adding that Ofcom will soon be able to demonstrate tangible progress in reducing online harms.

  • Intimate image abuse is a key focus of Ofcom’s updated guidance.
  • Hash matching is recommended as a proactive detection method.
  • Deepfake technology is significantly increasing the risk.
Tags: deepfake porn, UK internet watchdog, online safety, digital crime, non-consensual imagery