YouTube Removes 1 Million Videos for COVID-19 Misinformation

YouTube's Efforts to Combat COVID-19 Misinformation
Since February 2020, YouTube has taken down 1 million videos that contained dangerous and misleading information regarding COVID-19. The figure was disclosed by Neal Mohan, YouTube’s Chief Product Officer.
Addressing the Spread of Misinformation
Mohan detailed the company’s strategies for tackling misinformation in a recent blog post. He noted a significant shift in the landscape of misinformation, stating that it has moved “from the marginal to the mainstream.”
Previously confined to niche communities, misinformation now permeates various aspects of society and can disseminate rapidly. This poses a considerable challenge to platforms like YouTube.
The Proportion of Harmful Content
Despite the large number of removals, the YouTube executive emphasized that problematic videos make up a relatively small share of what appears on the platform.
Roughly 0.16 to 0.18 percent of all views on YouTube go to videos that violate the platform’s policies, and YouTube proactively removes nearly 10 million videos each quarter.
Notably, the majority of those removed videos attract minimal viewership; most never reach even 10 views.
Similar Arguments from Facebook
Facebook recently made a similar argument about the content shared on its network. A newly published report indicated that its most frequently viewed posts are memes and other non-political material.
Facing criticism regarding its handling of COVID-19 and vaccine-related misinformation, Facebook has asserted that such misinformation does not accurately reflect the content typically seen by its users.
Scrutiny and Transparency Concerns
Both YouTube and Facebook have faced increased scrutiny regarding their policies on health misinformation, particularly during the pandemic.
With user bases exceeding one billion individuals, even a small percentage of misleading content can have a substantial impact. However, both platforms have been hesitant to release comprehensive data on the dissemination of vaccine and health misinformation, or the number of users exposed to it.
A Multi-faceted Approach
Mohan clarified that removing misinformation is only one component of YouTube’s broader strategy; the platform also works to amplify information from credible sources.
In addition, it limits the reach of videos containing potentially harmful misinformation, an approach he described as “ratcheting up information from trusted sources and reducing the spread of videos with harmful misinformation.”
Editor’s note: This article was originally published on Engadget.