On March 3, 2020, scams and misinformation about the coronavirus drew sharp attention from Facebook. The company stressed that sharing content that could endanger people is unacceptable, a stance echoed by founder and CEO Mark Zuckerberg. More than two years later, the platform has remained cautious about loosening restrictions, especially as the pandemic and perceptions of it continue to shape online discourse. The hesitation reflects a broader concern: how to balance free expression with public health when handling COVID-19 misinformation.
Finding the right path is challenging because the health crisis has affected countries differently. In response, the social network turned to its Oversight Board, a panel of experts established in 2020, to help guide difficult decisions. Nick Clegg, Meta's president of global affairs, explained that the board was asked for an advisory opinion on whether the current measures are still appropriate and whether it remains right to address COVID-19 misinformation under the platform's health-harming misinformation policy. The aim is to keep the response aligned with evolving health guidance while preserving user trust.
Meta's moderation approach has consistently aimed to curb coronavirus-related hoaxes. The policy covers conspiracy theories that deny the existence of COVID-19 or portray the virus as part of a malevolent plan, and it accounts for regional differences in how misinformation spreads. To limit harm, the platform has labeled false posts and directed users to reliable information from credible sources such as the World Health Organization. In addition, advertising that seeks to profit from the pandemic has been restricted or banned in many markets.
A non-binding opinion
The Facebook Oversight Board operates as an independent body, established in 2020, that advises on difficult content decisions, including cases involving high-profile figures and accounts. While the board's recommendations carry weight, the advisory nature of this opinion means Facebook is not obliged to follow it. Critics argue that this can blur accountability, suggesting the process serves as a shield for controversial outcomes rather than a true governance mechanism.
A final decision on the board's guidance may take some time: Meta has up to 60 days after the opinion's publication to respond to the board's recommendations. This window reflects the need to weigh complex considerations, including safety, freedom of expression, and consistency with the platform's broader policies.