How Misinformation Spread on Facebook During the 2020 US Election
The 2020 US election cycle was marked by a deluge of misinformation, fueled by a global pandemic, political polarization, and the decline of trust in traditional media. Social media platforms, particularly Facebook, became breeding grounds for false narratives, prompting concerns about their impact on the democratic process. New research from Northeastern University sheds light on the distinct patterns of misinformation dissemination on Facebook during this crucial period, revealing insights into how false content circumvented platform policies and proliferated through peer-to-peer networks.
The study, conducted by David Lazer, distinguished professor of political science and computer sciences at Northeastern, and his colleagues, analyzed all Facebook posts shared at least once between the summer of 2020 and February 1, 2021. Their findings, published in Sociological Science, contrast the spread of misinformation with the typical dissemination of content on the platform. While most content spread in a “big bang” pattern, emanating rapidly from large Pages and Groups, misinformation followed a different trajectory.
Typical content dissemination on Facebook resembles a wide, shallow tree. A post from a popular Page, for instance, instantly reaches millions of followers, with subsequent reshares by individual users forming the shorter branches. This pattern reflects the centralized nature of information sharing, driven by established entities with vast audiences. Misinformation, by contrast, exhibited a “slow burn” pattern, gradually expanding through individual-to-individual sharing within smaller networks. This suggests a shift from centralized broadcasting to decentralized, peer-to-peer dissemination.
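The two shapes can be made concrete with a toy measurement of reshare trees. The sketch below is purely illustrative and not drawn from the study: the edge lists, node names, and the cascade_metrics function are all hypothetical stand-ins for the kind of cascade data the researchers analyzed. It computes the depth and maximum breadth of a reshare tree, showing how a “big bang” post from a large Page yields a shallow, wide tree, while chained person-to-person resharing yields a deep, narrow one.

```python
from collections import defaultdict

def cascade_metrics(edges):
    """Compute (depth, max_breadth) of a reshare cascade.

    `edges` is a list of (parent, child) reshare pairs; the original
    post appears only as a parent. Depth counts sharing generations,
    breadth counts simultaneous reshares at the widest generation.
    """
    children = defaultdict(list)
    child_set, nodes = set(), set()
    for parent, child in edges:
        children[parent].append(child)
        child_set.add(child)
        nodes.update((parent, child))
    roots = nodes - child_set  # posts that were never themselves a reshare
    depth, max_breadth = 0, 0
    frontier = list(roots)
    while frontier:
        max_breadth = max(max_breadth, len(frontier))
        frontier = [c for n in frontier for c in children[n]]
        if frontier:
            depth += 1
    return depth, max_breadth

# "Big bang": one Page post reshared directly by many users.
big_bang = [("page_post", f"user_{i}") for i in range(8)]
# "Slow burn": a chain of person-to-person reshares.
slow_burn = [(f"user_{i}", f"user_{i+1}") for i in range(8)]

print(cascade_metrics(big_bang))   # (1, 8): shallow, wide tree
print(cascade_metrics(slow_burn))  # (8, 1): deep, narrow tree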
The researchers attribute this distinct pattern to two primary factors. First, Facebook’s policies at the time restricted misinformation originating from Pages and Groups while leaving individual users largely unchecked, inadvertently creating a loophole that pushed misinformation sharing away from established sources and toward individual accounts. Second, Facebook’s 5,000-friend limit on individual accounts capped the reach of each sharing step, producing a gradual, viral spread through long reshare chains that contrasts with the potentially explosive reach of Pages and Groups, whose followings are unlimited.
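A back-of-the-envelope branching calculation, using made-up numbers rather than anything from the study, shows why that cap matters. A Page with a million followers reaches its whole audience in one step; an individual account needs several sharing generations, each of which can fizzle out along the way.

```python
import math

def generations_needed(audience, reach_per_sharer):
    """Sharing generations needed to reach `audience` people if each
    sharer reaches `reach_per_sharer` new people per generation
    (a crude branching-process approximation; numbers are illustrative)."""
    return math.ceil(math.log(audience) / math.log(reach_per_sharer))

# Best case: an individual account maxed out at 5,000 friends, every
# one of whom sees and reshares the post.
print(generations_needed(1_000_000, 5_000))  # 2 generations
# More realistically, only a handful of friends reshare each time.
print(generations_needed(1_000_000, 8))      # 7 generations
```

In practice each generation depends on fresh human decisions to reshare, which is exactly what makes the individual-account route slower and more diffuse than a single broadcast from a large Page.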
Facebook’s crackdown on misinformation had mixed results. While it effectively limited the spread of false narratives from prominent sources, it simultaneously pushed dissemination onto individual users. This highlighted the inherent challenges of content moderation in a decentralized online environment and demonstrated how readily misinformation adapts to exploit platform vulnerabilities. The researchers acknowledge that Facebook implemented multiple measures to combat misinformation at once, making it difficult to isolate the precise impact of each policy change.
Lazer describes Facebook’s strategy as employing “break-the-glass” measures, typically reserved for emergencies. These interventions, while intended to curb the immediate spread of harmful content, had unintended consequences. Inconsistent enforcement further complicated matters, with periods of heightened scrutiny around Election Day and the January 6 Capitol attack interspersed with periods of relative laxity. This uneven application of the rules likely contributed to the ongoing circulation of misinformation.
The study’s findings underscore the challenges of regulating online misinformation, particularly in the face of evolving tactics and platform limitations. The shift from centralized to decentralized dissemination highlights the need for more comprehensive approaches that address the spread of false narratives at both the individual and organizational levels. Furthermore, the research emphasizes the importance of understanding the dynamics of social media platforms and their role in shaping the flow of information, especially during critical events like elections. Unfortunately, due to recent changes in Facebook’s data access policies, replicating this study for subsequent elections is no longer feasible, leaving a gap in our ongoing understanding of misinformation’s evolving landscape.