
Digital Regulation: Empowering Users for a Better Internet

November 29, 2021

The Rise of Reliable Information During the COVID-19 Pandemic

As COVID-19 spread rapidly around the world in 2020, demand for trustworthy information surged. In response, a global collective of volunteers stepped forward to meet that need.

This network consolidated information from scientists, journalists, and medical professionals, making it readily available to the general public. Dr. Alaa Najjar, a medical doctor and Wikipedia volunteer, exemplifies this dedication, using breaks during his emergency room shifts to combat COVID-19 misinformation on Arabic Wikipedia.

Volunteer Efforts and Wikipedia's Role

Similarly, Dr. Netha Hussain, a clinical neuroscientist based in Sweden, devoted her free time to editing COVID-19-related articles in both English and Malayalam, a language spoken in southwestern India.

Her focus later shifted towards enhancing Wikipedia articles concerning COVID-19 vaccines. Through the contributions of Najjar, Hussain, and over 280,000 other volunteers, Wikipedia established itself as a highly trusted resource.

The platform provided up-to-date and comprehensive knowledge about COVID-19, encompassing nearly 7,000 articles across 188 languages. Wikipedia’s extensive reach and capacity to facilitate global knowledge-sharing are fundamentally enabled by legal frameworks that support its collaborative, volunteer-driven structure.

The Digital Services Act and its Potential Impact

Currently, the European Parliament is deliberating new regulations, such as the Digital Services Act (DSA), designed to hold large technology platforms accountable for illegal content disseminated through their services.

It is crucial that these regulations safeguard the public’s ability to collaborate in the pursuit of the public good. Lawmakers rightly aim to curtail the proliferation of content that causes harm, whether physical or psychological, including content deemed illegal in numerous jurisdictions.

Balancing Accountability and Collaboration

We acknowledge and support certain proposed elements within the DSA, particularly those pertaining to increased transparency in platform content moderation practices. However, the current draft incorporates prescriptive requirements regarding the enforcement of terms of service.

While seemingly necessary to address the growing influence of social media and ensure online safety, these measures could inadvertently disadvantage projects like Wikipedia. Some proposed stipulations risk transferring power from individuals to platform providers, potentially hindering digital platforms that diverge from large commercial models.

Fundamental Differences Between Platforms

Big Tech platforms operate on fundamentally different principles than nonprofit, collaborative websites such as Wikipedia. All content created by Wikipedia volunteers is freely accessible, the site carries no advertising, and it does not track users' browsing activity.

Commercial platforms prioritize maximizing profits and user engagement through algorithms that leverage detailed user profiles to deliver targeted content. These platforms also employ automated content moderation systems, which are prone to errors in both over- and under-enforcement.

For instance, automated systems frequently misidentify artwork or satire as illegal content, and they lack the human nuance and contextual understanding needed to enforce rules effectively.

Wikipedia’s Volunteer-Driven Model

The Wikimedia Foundation, along with affiliated organizations like Wikimedia Deutschland, provides support to Wikipedia volunteers and upholds their autonomy in determining the content that resides on the platform.

Wikipedia’s open editing model is predicated on the belief that people should collectively decide what information is included, guided by established volunteer-developed guidelines for neutrality and reliable sourcing.

This approach ensures that people with expertise and a genuine interest in a given subject are the ones enforcing content rules on the corresponding Wikipedia article. Furthermore, our content moderation process is transparent and accountable, with all editor conversations publicly accessible.

Protecting the Public Interest

While not flawless, this system has proven largely successful in establishing Wikipedia as a global source of neutral and verified information. Imposing operational constraints on Wikipedia, forcing it to emulate a commercial platform with a centralized power structure and diminished accountability, would likely undermine the DSA’s intended public interest goals.

Such a change would effectively exclude our communities from crucial decisions regarding content. The internet currently stands at a critical juncture, with democracy and civic space facing challenges globally.

A Call for Inclusive Regulations

Now, more than ever, careful consideration must be given to how new regulations will either foster or impede an online environment conducive to culture, science, participation, and knowledge. Lawmakers should actively engage with public interest communities like ours.

This collaboration will facilitate the development of standards and principles that are more inclusive, enforceable, and effective. However, regulations should not be solely tailored to the most powerful commercial internet platforms.

A better and safer internet is a shared aspiration. We urge lawmakers to collaborate with diverse stakeholders, including Wikimedia, to design regulations that empower citizens to collectively improve the online experience.

Tags: digital regulation, internet governance, user empowerment, online safety, digital rights