When it came to the pandemic and COVID denialism, Facebook repeated the mistakes of the past. At each turn, Facebook took an overly narrow approach that avoided addressing the root of the problem. As a result, COVID denialism continues to fester on the platform, and the pandemic persists.
In March 2020, as the pandemic was ramping up in the United States, Facebook announced that it was taking action. That action, however, was limited primarily to removing false claims about COVID cures, misinformation about essential services and outbreak severity, and “misinformation that could cause immediate physical harm.”[54] For everything else, Facebook outsourced the decisions.
“For claims that don’t directly result in physical harm, like conspiracy theories about the origin of the virus, we continue to work with our network of over 55 fact-checking partners covering over 45 languages to debunk these claims,” wrote Facebook’s Vice President of Global Affairs and Communications, Nick Clegg.[55]
For the United States, fact-checking partners consisted of only AFP, the Associated Press, Check Your Fact, The Dispatch, Factcheck.org, Lead Stories, Science Feedback, Reuters Fact Check, and USA Today.[56] None of these organizations had expertise in misinformation or was functionally equipped to analyze how the far right might weaponize misinformation about the virus. Instead, fact-checking partners were tasked with essentially acting as referees, calling balls and strikes on misinformation.
Facebook’s fact-checking process concentrated on articles published outside the platform, not on user-generated content on Facebook itself. In May 2020, for instance, Facebook bragged that independent fact-checkers had put warning labels on around 7,500 articles.[57] Yet misinformation remained rampant on the platform.
The focus on “misinformation” is itself misleading. It places emphasis exclusively on the spread of false information, as though misinformation acts in a vacuum, obscuring the human element of both individual and collective action. The misinformation framework also reduces the question to a true/false binary, eliminating discussion of the role of ideology and conspiracy theories. Finally, it sidesteps crucial political questions about how the platform may be hindering efforts to stop the spread of a pandemic.
A misinformation framework centers on false information spread regardless of any intent to mislead. Collective action grounded in misinformation or conspiracy theories poses an even more significant challenge. There is little evidence that misinformation by itself can create a social movement. Instead, far-right movements generate conspiracy theories and misinformation to build mythology and further movement aims.
Public health officials have noted that accurate information is vital to the public, particularly during the early stages of a pandemic. Facebook’s position during this early period was to tweak around the edges rather than address the problem head-on. Even when fact-checkers determined that something was misinformation, Facebook chose not to remove it. According to Facebook’s Nick Clegg, “Once a post is rated false by a fact-checker, we reduce its distribution so fewer people see it, and we show strong warning labels and notifications to people who still come across it, try to share it or already have.”[58]
Clegg’s announcement did not mention the growing number of groups popping up on the platform dedicated to COVID denialism. These groups relied on the platform to find followers and organize COVID denial events. By April 2020, IREHR had identified over 200 COVID denialist groups with over 1.2 million members. Once individuals joined these groups, the limited policies Facebook put in place became almost meaningless.
The structure of these Facebook groups also short-circuited Facebook’s attempts at promoting accurate information about the pandemic. The mutually reinforcing conspiracy-think of many group members makes it difficult for any challenging information to get through, a problem compounded by the way these forums foster an insulated, “trusted” community of the like-minded.
Facebook groups are virtually impenetrable to Facebook’s own efforts against COVID misinformation. Those efforts centered on cutting off outside misinformation before it entered the platform. Facebook groups stood that dynamic on its head: this time, the COVID misinformation was coming from inside the platform.
Facebook groups create an intimacy of connection between individuals, more like a peer-to-peer messaging app than a newsfeed. Indeed, Facebook promoted Groups as intimate, trusted, private spaces that create community.[59] Unfortunately, COVID denial groups are weaponizing these trusted spaces.
Rather than trusting some outside article as a reliable source of information, this closeness of contact leads members to trust one another. People receive and share information directly to and from others who have become their closest contacts and their “trusted” sources of information. These cognitive bubbles silo users in a never-ending deluge of misinformation. Groups create an uninterrupted feedback loop, continually reinforcing misinformation and deepening radicalization.
As Nina Jankowicz, a disinformation fellow at the Wilson Center, and Cindy Otis, a senior fellow at the Atlantic Council’s Digital Forensic Research Lab, noted, “despite the company’s recent efforts to crack down on misinformation related to COVID-19, the Groups feature continues to serve as a vector for lies.”[60]
For members of COVID denial Facebook groups, there is a plethora of spaces where an entire community will agree with them that, for instance, masks are the worst-ever assault on freedom. In these spaces, there is little criticism or dissent. Instead, declarations of such sentiment get repeatedly liked, cheered, and shared. There is also often a dynamic in which the sentiment builds on itself: a statement that “masks are terrible” is one-upped by a reply that “masks are slavery,” which draws a response that “masks are like the Holocaust,” and so on. There are no brakes on the Facebook group radicalization train.
On top of this, Facebook tweaked the platform to “increase engagement” and get more users into groups. “Related Discussions” push material from other COVID denial groups into users’ news feeds, exposing them to more COVID denial groups and additional sources of misinformation. In addition, Facebook’s recommendation engines, “suggested groups,” and the site’s search results further drive people to more militant groups they might not otherwise find.
The Private Group Problem
On Facebook, there are two different types of groups. Public groups are open for anyone to join, content from the group is visible to anyone on the platform, and posts appear in Facebook searches.
Private groups, on the other hand, are shielded from the public eye. Users must be admitted to a private group by an administrator, and prospective members are often asked vetting questions designed to screen out people who might challenge the prevailing wisdom inside the group. Content from private groups is inaccessible to non-members. These settings make tracking the full extent of far-right activity in COVID denial Facebook groups far more difficult.
Taking it to the Streets
Zuckerberg’s earlier Tea Party comment is also an important reminder that what happens on Facebook doesn’t stay on Facebook. There are real-world consequences when far-right movements are incubated on Facebook and unleashed on the world. Groups also make it easy to organize events, rapidly moving COVID denialism off the platform and into the real world. From attacks on hospitals and doctors, to threats against teachers and school board members, to organized efforts to block policies designed to curtail COVID-19, Facebook groups are helping create a toxic public health environment. (More on these real-world impacts in section three.)
At the peak of the first wave of Facebook COVID denial groups, IREHR tracked 1,186 groups with 3,032,085 members. These Facebook groups were where large COVID denial rallies were organized, including the armed storming of state capitol buildings, the hanging of elected officials in effigy, and even threats to kidnap and murder public officials.
Gaming the System
When it comes to the problem of far-right infestation of the platform, in many ways, Facebook is mired in static thinking in a dynamic world. Far-right groups are constantly evolving and adapting to circumstances.
In physics, there is a concept known as the “observer effect” (often confused with the Heisenberg uncertainty principle): measurements of certain systems cannot be made without changing the system itself. In the case of Facebook, external events and internal efforts to counteract misinformation have likewise changed the COVID denialist system on the platform.
This problem is not new for Facebook. Last summer, for instance, two men met on Facebook in a far-right paramilitary Boogaloo group. The two are accused of murdering a federal officer in Oakland, and one of the men is also charged with killing a Santa Cruz sheriff’s officer. Facebook was widely criticized for being slow to react.[61] In response, Facebook banned the term “Boogaloo.” It took all of one day for Boogaloo activists to adapt, choosing new and different keywords to game the system.
This pattern is also playing out among COVID denial groups. Several have changed group names to avoid tripping Facebook’s automated systems. Others have changed names to reflect the shifting sentiment of the group. For instance, “Floridians Against Excessive Quarantine” changed its name to “Patriot Floridians for FREEDOM over fear!” How to avoid a Facebook strike is a common topic in COVID denial groups, particularly those in the anti-vaxx and anti-mask wings.
Warning Labels Return
As part of the platform’s Community Standards Enforcement Report, Facebook announced that between April and June 2021, it removed 20 million posts that contained COVID-19 misinformation.[62] The platform also said that warning labels had been added to more than 190 million COVID-19-related posts.
Guy Rosen, a Vice President for Integrity at Facebook, wrote back in April 2020, “When people saw those warning labels, 95% of the time they did not go on to view the original content.”[63] Data to support this claim has not been made public. It is also worth noting that Rosen’s claim came early in the pandemic. Facebook has since reported adding warning labels to more than 190 million COVID-19-related posts over the course of the pandemic. Even by Facebook’s own metrics, those numbers suggest the strategy is not deterring the spread of misinformation.
Warning labels may work for the vaccine-hesitant, but they do nothing for the ideologically hardened activists who have been fighting COVID-19 health and safety measures for over a year. For individuals already drawn into the expanding world of COVID denial Facebook groups, warning labels on COVID misinformation appear to be about as successful as Parental Advisory stickers were at deterring kids from listening to hip hop in the ’90s. Not only does a label attract immediate attention to the labeled post, it also fuels the COVID denialist victim complex. Members of these groups often lash out angrily at Facebook (on Facebook), claiming that they are being “unfairly” targeted or that Facebook is part of the conspiracy to keep information hidden.