
UK Enforces Children’s Privacy Design Code Compliance

September 1, 2021

The U.K.’s Children’s Code: Compliance Deadline Arrives

A year-long grace period for complying with the Age Appropriate Design Code in the U.K. ends today. From now on, developers of apps and digital services likely to be used by individuals under the age of 18 are expected to meet a set of standards designed to protect children from data tracking and profiling.

Background on the Children’s Code

The Age Appropriate Design Code, often referred to as the ‘Children’s Code’, came into force on September 2nd of the previous year. However, the U.K.’s data protection authority, the ICO, allowed the maximum 12-month transition period to give organizations time to adapt their services.

Now, the ICO anticipates that these standards will be consistently upheld.

Scope of the Code’s Application

The code’s provisions extend to a wide range of services, encompassing connected toys, educational technology, and online games. It also applies to online retail platforms and profit-driven services like social media and video-sharing sites that are particularly appealing to younger users.

Key Stipulations of the Code

A core principle of the code is the application of “high privacy” settings by default when a user is, or is suspected to be, a child. This includes disabling geolocation and profiling features unless a strong justification exists for their activation.

Furthermore, the code mandates the provision of parental controls alongside age-appropriate information for children regarding these tools. It cautions against parental monitoring tools that operate secretly, without the child’s awareness.

Addressing Dark Patterns and Nudge Techniques

The code also specifically addresses manipulative design practices, known as “dark patterns.” App developers are warned against employing “nudge techniques” to encourage children to share unnecessary personal data or to weaken their privacy protections.

The Code’s Structure and Enforcement

The code comprises 15 standards. It is not itself legislation; rather, it is a set of design standards that the ICO expects app developers to follow, and which it will take into account when assessing compliance.

The ICO is explicitly linking adherence to these children’s privacy standards with compliance with broader data protection laws already in effect in the U.K., providing a regulatory incentive for compliance.

Potential Risks of Non-Compliance

Applications that disregard the standards risk attracting the attention of the ICO, either through complaints or proactive investigations. This could lead to a comprehensive audit of their overall approach to privacy and data protection.

The ICO has stated its intention to monitor compliance through audits, consider complaints, and take appropriate enforcement action, in accordance with its Regulatory Action Policy.

ICO’s Warnings and Expectations

The ICO warns that failing to adhere to the code may hinder an organization’s ability to demonstrate compliance with the U.K. GDPR and PECR.

Stephen Bonner, an executive director at the ICO, has emphasized that the regulator will proactively engage with social media platforms, video streaming services, and the gaming industry to assess their alignment with the code.

Focus on High-Risk Sectors

The ICO has identified social media, video and music streaming, and video gaming platforms as presenting the greatest risks to children’s data privacy. Concerns include the use of personal data to bombard children with content, inappropriate advertisements, and privacy-eroding prompts.

The ICO stresses that children’s rights must be respected and that organizations must prioritize their best interests.

Enforcement Powers of the ICO

The ICO possesses substantial enforcement powers, including the ability to impose fines of up to £17.5 million or 4% of an organization’s annual worldwide turnover under GDPR. It can also issue orders to halt data processing or require changes to non-compliant services.

Industry Response and Recent Changes

In recent months, several major platforms, including Instagram, YouTube, and TikTok, have announced changes to their handling of minors’ data and account settings in anticipation of the compliance deadline.

Instagram has defaulted teen accounts to private, while Google has implemented similar changes for YouTube accounts. TikTok has also added enhanced privacy protections for teenagers.

Apple and Child Safety Features

Apple has also faced scrutiny regarding its child safety features, including a CSAM detection tool for iCloud and an opt-in parental safety feature for Messages.

Global Impact and Influence

The U.K.’s proactive approach to online child safety is gaining international attention. U.S. lawmakers have called for American tech companies to voluntarily adopt the ICO’s standards, and Ireland is preparing to introduce similar principles.

Complementary Legislation and Future Developments

The code complements incoming U.K. legislation that will impose a “duty of care” on platforms to prioritize user safety, with a particular focus on children.

The ICO plans to publish its position on age assurance this autumn, providing further guidance to organizations on this complex issue.

Challenges and Potential Conflicts

Despite the progress, challenges remain. The government’s broader push for online safety may conflict with the code’s emphasis on data minimization. For example, lawmakers have suggested that social media platforms should prevent children from using end-to-end encryption, potentially compromising their privacy.

This creates a tension between protecting children and respecting their right to privacy and security.

Navigating these conflicting requirements will be a significant challenge for digital services operating in the U.K.

#children's privacy #UK privacy #design code #data protection #online safety