
Digital Inequality: Liberty, Privacy, and the Widening Gap

August 10, 2021

The Emotional Core of Privacy and Legal Frameworks

Privacy often resonates on a deeply personal level, becoming particularly important during times of vulnerability or when individuals encounter unsettling data handling practices. However, courts and regulators rarely treat emotional harm alone as sufficient grounds for legal redress or for systemic changes to privacy law.

Significant advancements in U.S. privacy protections may require a focus on the tangible consequences of growing privacy inequalities and their connection to broader societal disparities.

Apple’s App Tracking Transparency (ATT) and its Impact

In 2020, Apple announced its App Tracking Transparency (ATT) feature, which shipped with iOS 14.5 in April 2021. ATT requires apps to obtain a user's explicit permission before tracking their activity across other companies' apps and websites.

The rollout of ATT has produced a substantial shift in user behavior: roughly 75% of iOS users decline cross-app tracking when prompted.

Shifting Advertising Spend

The reduced availability of data for advertisers aiming to create detailed user profiles for targeted advertising has diminished the effectiveness and attractiveness of advertising on iOS devices.

Recent data indicates a decline in advertising expenditure on iOS platforms, with advertisers now spending roughly one-third less.

That capital is being reallocated toward advertising on Android, which currently holds a 42.06% share of the U.S. mobile operating system market, compared with iOS's 57.62%.

The Material Risks of Privacy Disparities

Beyond a general feeling of unease, inequalities in privacy protection are increasingly creating tangible risks, including emotional distress, damage to reputation, and financial losses.

If privacy is truly a universal right, as many technology companies assert, why is access to it often so costly?

When one group of users strengthens its privacy defenses, companies frequently adapt by shifting their data collection efforts towards populations with limited resources – be they legal, financial, or technical – to effectively manage their personal information.

Beyond Advertising Revenue

With increasing financial investment in Android advertising, a rise in both the complexity and intensity of advertising methods is anticipated. While targeted advertising itself isn't unlawful, it must adhere to legal stipulations allowing users to decline data collection, as defined by regulations like the CCPA in California.

This situation presents two key concerns. First, residents of every state except California currently lack a comparable opt-out right. Second, granting an opt-out right to some users implicitly acknowledges that targeted advertising carries risks worth opting out of.
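Where an opt-out right does exist, honoring it in code can be as simple as gating any data sharing on a stored preference. A minimal sketch follows; all names here (`User`, `may_share_with_ad_partner`) are hypothetical, invented for illustration:

```python
# Minimal sketch: gating ad-data sharing on a CCPA-style opt-out flag.
# All names are hypothetical; real implementations must also propagate
# the opt-out to downstream partners.

from dataclasses import dataclass


@dataclass
class User:
    user_id: str
    state: str                # two-letter U.S. state code
    opted_out: bool = False   # "Do Not Sell My Personal Information"


def may_share_with_ad_partner(user: User) -> bool:
    """Share behavioral data only if the user has not opted out.

    Note: as of 2021, only California's CCPA mandates honoring this
    flag; applying it to every user regardless of state is a policy
    choice, not a legal requirement.
    """
    return not user.opted_out


ca_user = User("u1", "CA", opted_out=True)
tx_user = User("u2", "TX")
print(may_share_with_ad_partner(ca_user))  # False
print(may_share_with_ad_partner(tx_user))  # True
```

The point of the sketch is that the technical mechanism is trivial; what varies by state is whether the law obliges anyone to implement it.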

The process of targeted advertising necessitates third-party entities constructing and maintaining user profiles based on observed behaviors. Collecting information regarding app usage, such as exercise routines or purchasing habits, can result in deductions concerning private details of an individual’s life.

Consequently, a digital representation of the user is formed within a largely unregulated data ecosystem. This representation contains data – accurately or inaccurately inferred – that the user has not explicitly consented to share. (This is particularly true for those outside of California.)
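The profile-building step described above can be sketched in a few lines. The events, inference rules, and traits below are entirely invented for illustration; real ad-tech taxonomies contain thousands of categories:

```python
# Illustrative sketch of how a third-party profile accrues inferred
# attributes from observed app events. All data and rules are invented.

OBSERVED_EVENTS = [
    {"app": "running-tracker", "event": "workout_logged"},
    {"app": "pharmacy", "event": "purchase", "item": "prenatal vitamins"},
]

# Hypothetical inference rules: observed value -> inferred trait.
INFERENCE_RULES = {
    "workout_logged": "fitness-oriented",
    "prenatal vitamins": "possibly expecting a child",
}


def build_profile(events):
    """Accumulate traits inferred (rightly or wrongly) from raw events."""
    profile = set()
    for event in events:
        for trigger, trait in INFERENCE_RULES.items():
            if trigger in event.values():
                profile.add(trait)  # an inference the user never consented to
    return profile


print(build_profile(OBSERVED_EVENTS))
```

Note that the output contains sensitive inferences ("possibly expecting a child") that appear nowhere in the raw data the user knowingly generated, which is precisely the consent gap described above.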

Moreover, studies indicate that detailed user profiling through targeted advertising can contribute to discriminatory practices in areas like housing and employment, potentially violating federal legislation. It can also restrict individual agency by narrowing the options a person even sees before any conscious decision is made.

However, targeted advertising can also benefit smaller organizations and community groups by facilitating direct connections with relevant audiences. Despite differing opinions on targeted advertising, the fundamental issue lies in the absence of user control over its application.

Targeted advertising represents a substantial and rapidly expanding industry, yet it is merely one component of a wider range of commercial activities that do not prioritize user data privacy. Because these practices are not prohibited by law in much of the U.S., "it was the economical choice" can serve as a sufficient defense.

Privacy: An Increasingly Exclusive Benefit

Leading technology corporations, notably Apple, position privacy as a fundamental human right, a strategy that aligns well with their commercial interests. Given the current lack of comprehensive federal privacy legislation in the United States, a strong privacy pledge from a private entity is understandably attractive to consumers.

The sentiment “If governmental bodies are unwilling to establish a privacy benchmark, at least my mobile device provider will” is increasingly common. Despite the fact that a minority – only 6% – of Americans fully grasp how their data is utilized by companies, it is these same companies that are spearheading significant privacy initiatives.

However, if the assertion of privacy as a human right is limited to products accessible only to a select few, what does that imply about the universality of human rights? Apple's customer base skews toward higher incomes and educational attainment compared with those of its competitors.

This trend suggests a potentially concerning future where privacy inequalities between different socioeconomic groups are amplified. A cyclical pattern could emerge: individuals with limited financial means to secure privacy safeguards may also lack the resources to effectively address the complex technical and legal issues associated with practices like targeted advertising.

It is important to clarify that this is not an endorsement of Facebook’s position in its dispute with Apple regarding privacy versus cost (particularly in light of recent revelations concerning systemic access control vulnerabilities). In this instance, neither party appears to be achieving a favorable outcome.

Truly effective privacy safeguards should be accessible to all. Better yet, meaningful privacy protections should be built into every product as a baseline cost of doing business, one no company can justify avoiding. What is needed is privacy that is both robust and broadly available.

Charting the Course for Future Privacy

Progress in privacy hinges on advancements in two crucial areas: the enactment of robust privacy legislation and the provision of accessible privacy tooling for developers. A combined approach is essential. Lawmakers, not technology companies, should establish dependable privacy benchmarks for consumers.

Simultaneously, developers require readily available tools that eliminate any impediment – be it financial or logistical – to integrating privacy directly into product development.

Insights from Privacy Policy Experts

Regarding privacy legislation, numerous policy professionals are already contributing valuable perspectives. I will direct your attention to some recent publications that I find particularly insightful.

Stacey Gray and her colleagues at the Future of Privacy Forum have initiated a compelling blog series exploring the potential interplay between a federal privacy law and the evolving landscape of state-level regulations.

Joe Jerome recently delivered a comprehensive overview of the 2021 state privacy environment and the pathways toward comprehensive privacy protections for all citizens. A central point is that privacy regulation is only effective when it works for both individuals and the organizations that must comply with it.

This does not imply that regulations should prioritize business interests, but rather that businesses should have access to unambiguous privacy standards, enabling them to manage personal data with confidence and respect.

The Importance of Developer Privacy Tools

Concerning privacy tooling, making these resources easily accessible and affordable for all developers removes any justifiable excuse for failing to meet privacy standards. Consider the challenge of access control, for example.

Engineers frequently grapple with creating manual controls to govern data access for both personnel and end-users within intricate data ecosystems containing sensitive personal information.
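The kind of hand-rolled, field-level control engineers end up building can be sketched as follows. The roles, fields, and policy table are hypothetical, chosen only to show the shape of the problem:

```python
# Sketch of hand-rolled, field-level access control over a record
# containing personal data. Roles, fields, and policy are hypothetical.

RECORD = {
    "email": "user@example.com",
    "purchase_history": ["order-1001", "order-1002"],
    "precise_location": (37.77, -122.42),
}

# Which fields each internal role may read. Every new field or role
# means manually updating this table, which is why teams want tooling.
ACCESS_POLICY = {
    "support": {"email"},
    "analytics": {"purchase_history"},
    "admin": {"email", "purchase_history", "precise_location"},
}


def read_record(role: str, record: dict) -> dict:
    """Return only the fields the given role is permitted to see."""
    allowed = ACCESS_POLICY.get(role, set())
    return {key: value for key, value in record.items() if key in allowed}


print(read_record("support", RECORD))  # only the email field
```

Maintaining tables like this by hand across dozens of services is exactly the toil that standardized privacy tooling would remove.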

This presents a dual challenge. Firstly, existing technical debt often hinders progress, as privacy considerations have historically been absent from software development processes. Developers need tools that facilitate the implementation of privacy features, such as granular access control, before a product reaches production.

Secondly, even if engineers were to resolve all existing technical debt and implement structural privacy enhancements at the code level, what standardized tools are available for them to utilize?

A report from the Future of Privacy Forum, published in June 2021, highlights the critical need for consistent definitions within privacy technology, a prerequisite for the widespread adoption of reliable privacy tools.

With clearer definitions and widely available developer tools, these technical improvements will translate into tangible gains in how the technology sector – beyond any single brand – empowers users to control their data.

Regulation and Tooling: A Synergistic Approach

We require privacy regulations established by an impartial body, one not directly involved in the practices it governs. While regulation alone cannot resolve the challenges of modern privacy, it remains a crucial component of any effective solution.

Alongside regulation, every software engineering team should have immediate access to privacy tools. Just as civil engineers must design bridges to be safe for all users, our data infrastructure must be inclusive and equitable, preventing the exacerbation of disparities in the digital world.

Tags: digital inequality, privacy, liberty, digital divide, inequality, technology