
Flawed data is putting people with disabilities at risk

Cat Noone
April 19, 2021

The Real-World Impact of Data on Individuals

Data is not merely a theoretical concept; it profoundly influences the experiences of individuals in their daily lives.

An incident in 2019 highlighted this impact when an autonomous delivery robot temporarily obstructed a wheelchair user’s safe passage across a busy street. The individual rightly pointed out the necessity for technological development to avoid placing disabled individuals at risk.

Historical Disadvantages Faced by People with Disabilities

People with disabilities, alongside other marginalized communities, have consistently experienced harm resulting from inaccurate data and flawed data-driven tools.

Disabilities are characterized by their diversity, complexity, and evolving nature. This inherent variability often clashes with the rigid, pattern-seeking structure of AI systems.

Because artificial intelligence often dismisses atypical data points as irrelevant “noise,” individuals with disabilities are frequently excluded from the insights generated by these systems.
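The mechanism is easy to reproduce. The sketch below uses invented sensor readings (the numbers and the scenario are hypothetical, not from any real system) to show how a routine "noise removal" step silently discards a small group whose valid data simply looks different from the majority's:

```python
import statistics

# Hypothetical sensor readings: most users produce values near 1.0,
# while a small group (e.g., users with a different gait or speed
# profile) produce values near 3.0. All readings are valid.
majority = [1.0, 1.1, 0.9, 1.05, 0.95, 1.02, 0.98, 1.1, 0.9, 1.0] * 10
minority = [3.0, 3.1, 2.9]
readings = majority + minority

mean = statistics.mean(readings)
stdev = statistics.stdev(readings)

# A common cleaning step: drop anything beyond 2 standard deviations.
kept = [r for r in readings if abs(r - mean) <= 2 * stdev]
dropped = [r for r in readings if r not in kept]

# Every minority reading is discarded as an "outlier," so downstream
# models never see that group at all.
print(dropped)  # the three atypical-but-valid readings
```

No one decided to exclude the minority group; the exclusion falls out of a statistical default that treats rarity as error.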

The Case of Elaine Herzberg

The tragic death of Elaine Herzberg in 2018, struck by a self-driving Uber vehicle while pushing a bicycle, serves as a stark example.

Uber’s system encountered difficulty accurately classifying Herzberg, fluctuating between identifying her as a “vehicle,” “bicycle,” or “other.”

This event prompted critical questions within the disability community regarding the potential risks faced by individuals using wheelchairs or scooters due to similar misclassifications.

The Need for a Revised Data Approach

A fundamental shift is required in how we gather and process data. This encompasses a wide range of information, including personal details, user feedback, employment applications, multimedia content, and user behavior metrics.

This data is continuously utilized to refine our software, yet often without sufficient consideration of the potential for misuse or the ethical implications at each stage of development.

A Call for a Fairer Data Framework

Current digital products urgently require a new, equitable data framework that prioritizes the needs of people with disabilities.

Without such a framework, individuals with disabilities will continue to encounter increased obstacles and potential dangers in a world that is becoming increasingly reliant on digital technologies.

Ensuring inclusivity in data management is not simply a matter of fairness, but a critical step towards creating a safer and more accessible future for all.

The Impact of Flawed Data on Inclusive Tool Development

The creation of effective tools is often hindered by inaccuracies in the data used to build them. Inaccessible tools may not physically prevent people with disabilities from going about their lives, but they can significantly restrict access to essential services like healthcare, education, and convenient delivery options.

The Cycle of Bias in Data Systems

The tools we develop are inherently shaped by the environments in which they are created. They reflect the perspectives and biases of their designers. A long-standing issue has been the oversight of data systems by a limited range of individuals, leading to a self-perpetuating cycle where existing biases are reinforced and marginalized groups remain overlooked.

As data analysis techniques advance, this cycle intensifies. Machine-learning models, when consistently presented with data that equates “not being X” – meaning not being white, able-bodied, or cisgender – with being “abnormal,” will build upon and amplify this flawed foundation.
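A toy example (the numbers are invented for illustration) shows why that flawed foundation rewards itself. When the minority class is small, a model that simply learns the majority pattern already scores well, so nothing in the training objective pushes it to represent the minority at all:

```python
from collections import Counter

# Hypothetical training labels: 95 records marked "typical" and 5
# disabled users' records marked "atypical" by earlier flawed tooling.
labels = ["typical"] * 95 + ["atypical"] * 5

# A model that only learns "predict the majority" already looks good.
majority_label, _ = Counter(labels).most_common(1)[0]
predictions = [majority_label] * len(labels)

accuracy = sum(p == y for p, y in zip(predictions, labels)) / len(labels)
minority_recall = sum(
    p == y for p, y in zip(predictions, labels) if y == "atypical"
) / 5

print(accuracy)         # 0.95 -- looks excellent on paper
print(minority_recall)  # 0.0  -- every minority record is misclassified
```

A 95% headline accuracy can coexist with total failure on the very group the system most needed to serve, which is how each training cycle amplifies the last one's blind spots.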

Interconnectedness and Indirect Bias

Data sets are intricately linked in ways that are often not immediately apparent. Simply stating that an algorithm won't discriminate against individuals with registered disabilities is insufficient. Biases can be embedded within other data sources.

For instance, while it is illegal in the United States to deny a mortgage based on race, relying heavily on credit scores – which themselves contain inherent biases against people of color – can indirectly lead to exclusion.

Examples of Indirect Bias

For individuals with disabilities, seemingly unrelated data points like frequency of physical activity or weekly commute times can introduce indirect bias. Consider a hiring algorithm analyzing facial movements during video interviews; a candidate with a cognitive or mobility impairment will inevitably face different challenges than an able-bodied applicant.
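The proxy effect is concrete enough to sketch. In this hypothetical screening scenario (the records, thresholds, and correlation are invented), the protected attribute is stripped before scoring, yet a correlated feature reproduces the same exclusion:

```python
# Hypothetical applicant records: (has_disability, commute_minutes).
# The scoring rule never sees the first field, but commute time is
# strongly correlated with it (e.g., longer paratransit journeys).
applicants = [
    (False, 20), (False, 25), (False, 30), (False, 22), (False, 28),
    (True, 70), (True, 65), (True, 80),
]

# The protected attribute is "removed" before scoring...
features = [commute for _, commute in applicants]

# ...but a rule tuned on historical hires ("short commutes correlate
# with retention") rejects exactly the disabled applicants.
rejected = [i for i, commute in enumerate(features) if commute > 45]
disabled = [i for i, (d, _) in enumerate(applicants) if d]

print(rejected == disabled)  # True: the proxy reproduces the bias
```

Dropping a protected column is therefore not the same as removing bias; the signal survives in whatever features correlate with it.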

The Exclusion of Disability in Market Research

A significant problem arises from the frequent omission of people with disabilities from businesses’ target market considerations. During the initial stages of user persona development, disabilities are often overlooked, particularly those that are less visible, such as mental health conditions.

Consequently, the initial user data used for product iteration doesn't accurately represent the needs of these individuals. Currently, a substantial 56% of organizations do not consistently test their digital products with users who have disabilities.

Towards More Inclusive Development

Proactively including individuals with disabilities within tech company teams would greatly increase the likelihood of creating a more representative target market. Furthermore, all technology professionals must be educated about, and actively account for, both visible and invisible exclusions present in their data.

Addressing this requires collaborative effort. Increased dialogue, forums, and knowledge-sharing initiatives are crucial for developing strategies to eliminate indirect bias from the data we utilize daily. Collaboration is key to building truly inclusive tools.

The Imperative of Ethical Data Stress Testing

Product testing is commonplace – encompassing usability, user engagement, and even preferences regarding visual elements like logos. We routinely determine which colors yield higher conversion rates and which phrasing most effectively connects with audiences. Given this commitment to optimization, why isn't a comparable standard being established for data ethics?

The onus of developing ethical technology isn't solely on leadership. Individuals directly involved in product development bear a corresponding responsibility. The case of Volkswagen illustrates this point; it was an engineer, not the CEO, who faced legal consequences for creating a device designed to circumvent U.S. emissions regulations.

Engineers, designers, and product managers must critically evaluate the data they encounter. A thorough examination of why data is collected and how it is obtained is essential. This necessitates a detailed analysis of requested data points and a clear understanding of underlying motivations.

Is it always necessary to inquire about an individual’s disabilities, gender, or ethnicity? How does possessing this information genuinely improve the user experience?

At Stark, a five-step framework has been created to guide the design and development of all software, services, and technological solutions. This framework requires addressing the following:

  • The specific data being gathered.
  • The rationale behind its collection.
  • How the data will be utilized, and potential avenues for misuse.
  • A simulation of "If This, Then That" scenarios. This involves outlining potential negative consequences of data misuse, such as large-scale breaches, and the impact of sensitive information becoming public.
  • A decision to proceed with the project or abandon it.
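The five steps above lend themselves to a simple review record with a gate that blocks collection when any question lacks a plain answer. This is an illustrative sketch only; the field names and logic are invented, not Stark's actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class DataReview:
    """One review record per proposed data point (hypothetical fields)."""
    data_point: str
    rationale: str                         # why it is collected
    uses: list = field(default_factory=list)              # how it will be used
    misuse_scenarios: list = field(default_factory=list)  # "if this, then that"
    proceed: bool = False                  # the final go/no-go decision

def gate(review: DataReview) -> bool:
    """Allow collection only if every question has a plain answer."""
    answered = all([
        review.data_point.strip(),
        review.rationale.strip(),
        review.uses,
        review.misuse_scenarios,
    ])
    return answered and review.proceed

review = DataReview(
    data_point="user pronouns",
    rationale="",  # the team cannot state a plain reason
    uses=["personalized emails"],
    misuse_scenarios=["outing users in a data breach"],
    proceed=True,
)
print(gate(review))  # False -- no plain rationale, so don't collect
```

The point of the gate is the forcing function: an empty or hand-wavy answer to any step stops the collection outright rather than deferring the question.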

If data explanations rely on ambiguous language or require distortion of facts, its collection should be prohibited. This framework compels a simplified breakdown of data requirements. Inability to achieve this simplification indicates a lack of preparedness for responsible data handling.

Ultimately, transparency and clarity are paramount. If a justification for data collection cannot be articulated plainly and honestly, the practice should be reconsidered.

The Imperative of Inclusive Innovation for People with Disabilities

Advanced data technologies are rapidly expanding into diverse fields, encompassing areas like vaccine creation and autonomous vehicles. Any inherent biases against people with disabilities within these evolving sectors can prevent equitable access to the latest advancements in goods and services. As technology becomes increasingly integral to daily life, the potential for exclusion in routine activities grows.

Proactive inclusion, integrated from the outset of product creation, is paramount. This shift does not demand significant financial resources or extensive experience; it requires a deliberate change in mindset and in the development process, which costs nothing. The revenue lost by neglecting these markets, and the expense of retrofitting products later, ultimately exceed any initial investment in inclusive design.

Accessibility should be a core tenet for startups, woven into the product development lifecycle. Continuous reinforcement of these principles is achieved through the collection and analysis of user data. Disseminating insights across onboarding, sales, and design teams fosters a comprehensive understanding of user challenges.

Established companies should conduct thorough self-evaluations to identify gaps in their adherence to accessibility principles. Leveraging existing data alongside new user feedback will facilitate targeted improvements. This proactive approach ensures broader usability and market reach.

Beyond Technological Adaptation: The Need for Diverse Teams

Transforming AI and data practices extends beyond simply adjusting business models. A critical component is fostering greater diversity among the individuals leading these initiatives. These fields are currently characterized by a lack of representation, particularly concerning gender and racial diversity.

Furthermore, individuals with disabilities frequently report experiences of exclusion and bias within the technology sector. Until the teams responsible for developing data tools reflect a more inclusive demographic, progress will be hampered, and people with disabilities will disproportionately bear the consequences.

Key Takeaways

  • Prioritizing inclusive design from the beginning is more cost-effective than retrofitting.
  • User data is crucial for reinforcing accessibility principles.
  • Diversity within development teams is essential for mitigating bias.