Facebook Tests Pop-Up to Encourage Link Reading Before Sharing

May 10, 2021
Reshaping User Behavior on Social Media Platforms

After years of criticism over problematic content, social media companies are exploring subtler ways to shape how users interact with their platforms.

Mirroring a move Twitter made earlier, Facebook is testing a new feature designed to encourage users to read articles before sharing them. The test will initially reach 6% of Facebook’s Android users worldwide, with a gradual rollout intended to foster more “informed sharing” of news content.

Introducing Friction to Encourage Consideration

Sharing articles will remain easy, but the added friction may prompt users to pause before impulsively resharing content, particularly the inflammatory material that is prevalent on the platform.

Last June, Twitter initiated the deployment of prompts encouraging users to review a link prior to retweeting it. The company observed positive results from this test feature and subsequently expanded its availability to a larger audience.

Expanding Prompt-Based Interventions

Facebook initiated trials of comparable prompts last year. In June, the platform introduced pop-up notifications to alert users before sharing content exceeding 90 days in age, aiming to reduce the spread of misleading information presented without its original context.

Facebook indicated at the time that it was investigating additional pop-up prompts to mitigate various forms of misinformation. Subsequently, similar notifications were implemented, displaying the date and source of any shared links pertaining to COVID-19.

A Passive Approach to Information Integrity

This approach highlights Facebook’s inclination towards a subtle strategy of guiding users away from misinformation and towards its own validated resources on sensitive topics, such as COVID-19 and the 2020 election.

Whether this kind of gentle behavioral nudging can meaningfully address the misinformation crisis remains to be seen. In a similar vein, both Twitter and Facebook have experimented with prompts designed to discourage abusive comments before they are posted.

The Future of Moderation

Pop-up messages that convey to users that their actions are being monitored may represent a future direction for automated moderation on social platforms.

Although users might benefit more from social media companies completely overhauling their existing platforms—which are often rife with misinformation and abuse—and rebuilding them with more careful consideration, incremental behavioral adjustments are currently the most feasible solution.

Tags: facebook, social media, sharing, links, misinformation, fact checking