Politically Asymmetric Sanctions: A Consequence of Disparate Misinformation Sharing

By Press Room | December 17, 2024

2020 Election Study: An In-Depth Look at Misinformation and Account Suspensions on Twitter

The 2020 US presidential election was a highly contentious period marked by the proliferation of misinformation on social media platforms like Twitter. To understand the dynamics of misinformation sharing and its potential consequences, researchers conducted a comprehensive study examining the relationship between political orientation, low-quality news dissemination, and account suspensions on Twitter during and after the election. The study combined data collection from multiple sources, statistical modeling, and policy simulations to examine how these factors interact.

The researchers began by collecting a vast dataset of tweets from users who engaged with the election hashtags #Trump2020 and #VoteBidenHarris2020 on October 6, 2020. They also gathered data on the users’ tweeting history, including the domains they shared. This data was carefully filtered to focus on users who shared links from a specific set of news websites previously evaluated for credibility, ensuring a reliable basis for assessing news quality. This initial dataset comprised roughly 9,000 users, balanced between supporters of both presidential candidates. Nine months later, these accounts were revisited to determine whether they had been suspended by Twitter.
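As a rough illustration of that filtering step, the sketch below keeps only users whose shared links resolve to domains that already have a credibility rating. The data, column names, and helper function are hypothetical stand-ins; the study's actual pipeline is not detailed in this summary.

```python
from urllib.parse import urlparse

import pandas as pd

def extract_domain(url: str) -> str:
    """Return the bare domain of a shared link, e.g. 'example-news.com'."""
    netloc = urlparse(url).netloc.lower()
    return netloc[4:] if netloc.startswith("www.") else netloc

# One row per shared link (hypothetical data standing in for the tweet archive)
tweets = pd.DataFrame({
    "user_id": [1, 1, 2, 3],
    "url": [
        "https://www.example-news.com/story1",
        "https://unrated-blog.net/post",
        "https://example-news.com/story2",
        "https://another-outlet.org/article",
    ],
})

# One credibility rating per domain (stand-in for the fact-checker ratings)
ratings = pd.DataFrame({
    "domain": ["example-news.com", "another-outlet.org"],
    "credibility": [0.9, 0.35],
})

tweets["domain"] = tweets["url"].map(extract_domain)
rated = tweets.merge(ratings, on="domain", how="inner")

# Restrict the analysis to users who shared at least one rated domain
eligible_users = rated["user_id"].unique()
print(f"{len(eligible_users)} users shared links from rated news sites")
```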

Crucially, the researchers relied on established methods for evaluating the quality of news sources shared by users. Recognizing the infeasibility of fact-checking individual tweets at scale, they used existing ratings of news website credibility from professional fact-checkers and politically balanced crowdsourced assessments. These ratings were aggregated into a "low-quality news sharing score" for each user, providing a quantifiable measure of their propensity to share potentially inaccurate information.
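The article does not spell out the exact aggregation rule, but one plausible way to turn domain-level credibility ratings into a per-user score is to average (1 − credibility) over the links each user shared, as sketched below. The data frame and the averaging rule are illustrative assumptions, not the study's published method.

```python
import pandas as pd

# rated: one row per (user, shared link) with a domain credibility in [0, 1]
rated = pd.DataFrame({
    "user_id":     [1, 1, 1, 2, 2, 3],
    "credibility": [0.9, 0.2, 0.3, 0.95, 0.85, 0.1],
})

# Higher score = greater propensity to share links from low-credibility domains
low_quality_score = (
    rated.assign(low_quality=lambda df: 1 - df["credibility"])
         .groupby("user_id")["low_quality"]
         .mean()
         .rename("low_quality_news_score")
)
print(low_quality_score)
# user_id 1 -> 0.53, 2 -> 0.10, 3 -> 0.90
```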

Assessing users’ political orientations was another key aspect of the study. Researchers employed a combination of methods, including hashtag usage, analysis of followed accounts, and the ideological leanings of shared news sources. They then combined these measures into an aggregate political orientation score, allowing for a nuanced understanding of users’ ideological positions along a continuous spectrum rather than simply categorizing them into binary groups.
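A simple way to combine several orientation signals into one continuous score, consistent with the description above, is to standardize each signal and average them. The sketch below assumes three numeric signals per user; the signal names, scaling, and user IDs are hypothetical rather than the study's exact procedure.

```python
import pandas as pd

# Hypothetical per-user orientation signals (positive = conservative, negative = liberal)
signals = pd.DataFrame({
    "hashtag_lean":  [ 0.8, -0.6,  0.1],   # lean implied by election hashtags used
    "followed_lean": [ 0.5, -0.9,  0.0],   # lean estimated from followed accounts
    "source_lean":   [ 0.7, -0.4, -0.2],   # mean ideology rating of shared outlets
}, index=[101, 102, 103])                   # hypothetical user ids

# z-score each signal so no single measure dominates, then average into one score
zscored = (signals - signals.mean()) / signals.std(ddof=0)
signals["orientation_score"] = zscored.mean(axis=1)
print(signals["orientation_score"])  # continuous scale rather than a binary label
```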

To explore the potential impact of hypothetical suspension policies, researchers simulated different scenarios with varying levels of stringency. This allowed them to estimate the probability of suspension for users based on their low-quality news sharing behavior and gauge the potential for disparate impact on different political groups. These simulations were conducted using both low-quality news sharing and bot-likelihood as potential grounds for suspension.
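The toy simulation below illustrates the general idea of testing suspension policies of varying stringency: suspend the users with the highest low-quality news sharing scores and measure how the suspensions fall across political groups. The synthetic data, the assumed difference in score distributions, and the simple quantile cutoff are assumptions for illustration only.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 10_000
users = pd.DataFrame({
    "orientation": rng.choice(["left", "right"], size=n),
})
# Assumed for illustration: one group shares slightly more low-quality news on average
users["low_quality_score"] = np.where(
    users["orientation"] == "right",
    rng.beta(2, 6, size=n),
    rng.beta(2, 8, size=n),
)

for stringency in (0.01, 0.05, 0.10):          # suspend the top 1%, 5%, 10% of scores
    cutoff = users["low_quality_score"].quantile(1 - stringency)
    suspended = users[users["low_quality_score"] >= cutoff]
    shares = suspended["orientation"].value_counts(normalize=True)
    print(f"top {stringency:.0%} policy -> share of suspensions: {shares.to_dict()}")
```

Even a politically neutral rule (suspend whoever scores highest) produces politically uneven suspension counts whenever the underlying score distributions differ between groups, which is the disparate-impact question the simulations were designed to quantify.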

The study’s scope extended beyond the primary Twitter dataset from the 2020 election. The researchers reanalyzed several existing datasets, including Facebook sharing data from 2016, multiple sets of Twitter data from 2018 to 2023, and datasets focusing on the sharing of false claims and COVID-19 misinformation. These additional datasets allowed the same research questions to be explored in different contexts and provided opportunities for cross-validation, bolstering the robustness of the findings.

The 2016 Facebook dataset focused on information sharing behavior in the aftermath of the 2016 US election. This dataset, collected via a Facebook app, included user self-reports of political ideology and links shared on the platform, allowing comparisons with the Twitter data. The Twitter datasets from 2018 to 2023 used different sampling methods, including samples drawn from the followers of political elites and samples stratified by follower count, adding depth and breadth to the investigation of misinformation sharing.

Further datasets directly examined the sharing of known false claims and COVID-19 misinformation. The false claims dataset focused on identifying Twitter users who shared specific false or true news headlines, providing a more direct measure of misinformation sharing than relying on news source quality ratings. The COVID-19 dataset gathered sharing intentions for true and false claims across 16 countries, allowing for cross-cultural comparisons of misinformation sharing behavior.

By combining these diverse datasets and employing rigorous methodologies, the study aimed to comprehensively analyze the relationship between political orientation, the spread of misinformation, and the implications of platform policies like account suspensions. These analyses contribute significantly to our understanding of online information ecosystems and their impact on democratic processes. The diverse range of data sources and timeframes offer a robust perspective on the complex challenges of misinformation, paving the way for more informed discussions and potential interventions to address this critical issue.
