
Tackling deep-seated bias in tech with Haben Girma, Mutale Nkonde and Safiya Noble

Devin Coldewey
Writer & Photographer, TechCrunch
March 10, 2021

The Pervasive Nature of Bias in Technology

Technological advances offer clear benefits, but they also carry real risks, particularly for communities that are already marginalized. Mutale Nkonde of AI for the People, disability rights advocate Haben Girma, and Safiya Umoja Noble, author of “Algorithms of Oppression,” have studied and documented these dangers extensively. At TC Sessions: Justice 2021, they discussed where bias in the tech industry originates, the harm it causes, and what can be done about it.

Distinguishing Bias in Technology from Bias in Individuals

When addressing bias in technology, it’s crucial to distinguish between the technology itself and the people who deploy it. A facial recognition system might be inherently biased, for example by performing poorly on darker skin tones, or it might be used in service of prejudiced practices, such as discriminatory stop-and-frisk policing.

  • Algorithmic bias concerns within Twitter and Zoom
  • The dismissal of Margaret Mitchell, a leading AI ethics researcher at Google
  • The future trajectory of Dr. Timnit Gebru’s work

The Risks Posed by Seemingly Harmless Technologies

Bias isn't limited to controversial technologies like facial recognition. Commonly used tools such as search engines and algorithmic news feeds, often taken for granted, can also harbor or contribute to harmful biases.

  • Google’s recent design alterations blurring the lines between advertisements and search results
  • Google’s threat to withdraw its search engine from Australia amidst lobbying efforts against the digital news code

The Double Disadvantage of Exclusion

Haben Girma, a deafblind disability rights advocate and the first deafblind graduate of Harvard Law School, emphasizes that accessibility involves far more than simple fixes like image captions.

Recently, a concerning trend emerged on TikTok in which users cast doubt on the life and accomplishments of Helen Keller, the deafblind author and activist. The skepticism spread widely on the platform, and because TikTok remains largely inaccessible to deafblind users, the people best positioned to push back were excluded from the conversation and left subject to the false narratives.

  • The Biden administration’s website refresh, prioritizing accessibility with a dark mode option
  • Evinced secures $17M in funding to accelerate web accessibility testing

Technology Used to Target Black Communities

Conversely, biases that already exist within institutions can be amplified by the deployment of seemingly objective technologies. When law enforcement uses tools like license plate readers or biometric checks, the systemic biases and problematic objectives of those agencies are carried over and reinforced.

The combination of these two forms of bias creates significant disadvantages for specific groups:

  • The dual role of technology as both a revolutionary asset and a potential liability within Seattle’s cop-free protest zone
  • New data revealing racial disparities in the use of force by Chicago police officers

Financial and Legal Repercussions of Ignoring Bias and Diversity

While ethical considerations should be paramount, a financial incentive can also accelerate progress. Companies now face increasing legal and financial liabilities for failing to address these issues. For example, an AI solution found to be significantly biased could lead to business losses and potential civil or governmental lawsuits.

It’s also vital to consider unintended consequences – how an application or service might be misused in ways the creators didn’t anticipate.

  • Discussions surrounding DE&I at Facebook, Prop 22, and gig worker earnings
  • Facebook’s repeated fines in Italy for misleading users regarding data usage

Prioritizing Inclusive Design from the Outset

The practice of shipping a product first and patching in accessibility later, rather than building it in from the start, is increasingly being challenged by advocates and developers alike. Designing for accessibility from the ground up is better for everyone involved, and ultimately more cost-effective.

  • Microsoft’s introduction of a new accessibility testing service for PC and Xbox games
  • Fable’s initiative to simplify disability-inclusive design through a service-based approach

A complete transcript of the discussion is available here.

TechCrunch Sessions: Justice – Related Discussions

  • A Discussion on Designing for Accessibility



Devin Coldewey

Devin Coldewey is a writer and photographer based in Seattle. More of his work can be found on his personal website, coldewey.cc.