As the COVID-19 Delta variant has increased infection rates worldwide and has threatened to triple the death toll related to COVID, Facebook has again been put in the crosshairs for health misinformation on its platform. Multiple commentators and reports have noted the prevalence of COVID-19 and vaccination misinformation on Facebook and the high rate at which this misinformation has been viewed. This recently culminated in a dispute with the White House in which President Biden accused Facebook of "killing people" with its COVID-19 misinformation. While President Biden eventually clarified that his statements were aimed more directly at those posting misinformation on Facebook than at Facebook itself, the whole affair has again underscored the difficulty Facebook has had moderating controversial speech, especially speech that could affect the general health and welfare of its users.
Non-profits, politicians, and policy advocates have often argued for amended moderation standards for Facebook. Republicans and conservative non-profits have often stated that Facebook’s moderation of content, especially conservative content, amounts to censorship, whereas Democrats have often asked for significantly more content moderation. In fact, Democrats have now discussed removing Facebook’s Section 230 safe harbor if Facebook refuses to enact significant protections against health misinformation.
In an effort to help mitigate these controversies and make difficult decisions, Facebook created an independent Oversight Board in 2020, made up of independent members with the authority to review cases and make final decisions on content moderation. The Oversight Board was originally met with significant fanfare, with commentators arguing that it offered an independent redress system that would provide due process and fair judgment on difficult speech issues.
In January 2021, the Oversight Board announced its first six decisions based on cases recommended to it by Facebook. One of the decisions concerned an October 2020 case in which Facebook removed a post advocating that the French government permit the prescription of hydroxychloroquine combined with azithromycin for use against COVID-19. In its reversal, the Oversight Board noted that the post did not rise to the level of imminent harm required by Facebook's Community Standards and that the decision did not comply with international human rights standards on limiting freedom of expression. The Oversight Board recommended that Facebook adopt less intrusive means of enforcing its health misinformation policies where the content does not reach Facebook's threshold of imminent physical harm. The Oversight Board also recommended that Facebook increase transparency around how it moderates health misinformation, including by publishing a transparency report on how the Community Standards are enforced.
However, the COVID-19 case is emblematic of the difficulties that arise when introducing quasi-legal processes into the corporate context. First and foremost, like any legal system, cases brought before the Oversight Board take time to adjudicate. In the case above, the incident occurred in October 2020, and a decision was not rendered until January 2021. With important, life-altering decisions like this, three or four months is a significant delay that could theoretically affect the lives of many users (especially in light of reports that people have died from ingesting hydroxychloroquine). If the situation had been reversed, and Facebook had left dangerous content up when it should have been taken down, one can hypothesize the great harm that could have been caused. Whereas private companies usually have the flexibility to be nimble and respond quickly to new policy and regulatory challenges, introducing a quasi-legal system slows Facebook's ability to finalize its decisions in a timely manner.
Secondly, and perhaps most important from Facebook's PR perspective, the quasi-judicial system has not inoculated Facebook against criticism. It is telling that Facebook has not used the Oversight Board as a shield when responding to criticism over its regulation of misinformation (especially because the Oversight Board actually recommended less content moderation in certain situations), as such an explanation would likely be unpalatable to most of its critics. In fact, in its response to the Oversight Board's recommendations, Facebook publicly disagreed with the recommendation that it adopt less intrusive means, stating that it would continue to remove misinformation based on consultation with the CDC and WHO. Facebook therefore had to push back against its own Oversight Board to defend itself from further public criticism.
Finally, and most importantly for those who care about COVID-19 and the safety of the community, it is not clear that the Oversight Board's decision was the right one. While hindsight is 20/20, the influence of misinformation on Facebook's platform, the reluctance of some people to take COVID-19 precautions such as masks and vaccines, and the increasing prevalence of the COVID-19 Delta variant highlight how important it is to deal with this problem appropriately. For example, a popular theory propagated on Facebook alleges that the COVID-19 vaccine is being used by the U.S. government to microchip the population. In a recent YouGov poll, one in five Americans said they believe that theory.
As Facebook and the public are finding out, making a process more independent does not guarantee that the process will reach the correct answer. The Supreme Court's history is littered with decisions that have been shown to be deeply problematic (e.g., Plessy v. Ferguson, Citizens United v. FEC, Korematsu v. United States). Similarly, just because the Oversight Board is stocked with global experts in a variety of fields does not prevent it from codifying decisions that incorrectly weigh harms against freedoms. The Oversight Board's recommendation that misinformation be corrected rather than removed looks foolhardy amid the current deteriorating situation. Its other recommendation, that misinformation guidelines be clarified, may also prove unworkable. Much like Justice Potter Stewart's famous quote regarding obscenity, misinformation may be easy to recognize but hard to define, making clear rules on what counts as misinformation difficult to draft and implement. Moreover, it is not clear whether the Oversight Board will revisit the health misinformation issue or what appetite, if any, it has to reverse its own opinions. It is therefore quite possible that this decision will stand and influence further Facebook decisions as the pandemic worsens.
In sum, while the Oversight Board once held significant promise, and while it still might prove itself to be a useful tool that forever changes policy implementation for private companies, the COVID-19 situation has shown it may not be the panacea it was heralded to be.