
TikTok and Facebook’s Disinformation Mitigation Efforts Fall Short Prior to US Election

By Press Room | March 27, 2025

Social Media Platforms Fail to Fully Curb Election Disinformation Ahead of 2024 US Presidential Election

A new investigation by Global Witness has revealed alarming vulnerabilities in the ability of major social media platforms, namely YouTube, Facebook, and TikTok, to detect and remove harmful election disinformation in the lead-up to the 2024 US presidential election. This comes just two years after a previous investigation exposed similar flaws, raising concerns about the platforms’ commitment to safeguarding electoral integrity. The investigation tested the platforms’ moderation systems by submitting ads containing various forms of election disinformation, including false voting information, voter suppression tactics, threats against election workers, claims questioning candidate eligibility, and incitement to violence. The ads were written in "algospeak," a tactic that substitutes numbers and symbols for letters in order to bypass content moderation filters.
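For illustration, the sketch below (a hypothetical Python example, not any platform’s actual moderation code or Global Witness’s methodology) shows how a simple number-for-letter substitution can slip past a naive keyword filter unless the text is normalized first; the substitution map and watchlist are assumptions made purely for the demonstration.

    # Hypothetical sketch of why "algospeak" evades naive keyword filters.
    # The substitution map and keyword list below are illustrative assumptions,
    # not any platform's real moderation rules.

    LEET_MAP = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a",
                              "5": "s", "7": "t", "@": "a", "$": "s"})

    BLOCKED_TERMS = {"vote", "election", "ballot"}  # illustrative watchlist only

    def naive_filter(text: str) -> bool:
        """Flags text only if a blocked term appears verbatim."""
        lowered = text.lower()
        return any(term in lowered for term in BLOCKED_TERMS)

    def normalized_filter(text: str) -> bool:
        """Maps common number/symbol substitutions back to letters before matching."""
        normalized = text.lower().translate(LEET_MAP)
        return any(term in normalized for term in BLOCKED_TERMS)

    ad_copy = "Skip the lines: you can v0te by text message this year!"
    print(naive_filter(ad_copy))       # False -- the obfuscated spelling slips through
    print(normalized_filter(ad_copy))  # True  -- normalization catches it

Real moderation pipelines are far more elaborate, but the example captures the basic evasion the investigation exploited: a filter keyed to exact strings misses text that a human reader parses without difficulty.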

TikTok emerged as the worst performer, approving 50% of the disinformation-laden ads despite an explicit ban on all political advertising. While this represents an improvement over the 90% approval rate recorded during the 2022 midterm elections, it remains a significant concern given TikTok’s growing influence and its popularity among young voters. The platform’s detection system appeared ineffective, failing to identify ads containing clear disinformation as long as they did not mention candidates by name. TikTok attributed the approvals to errors in its machine moderation system and emphasized the multi-layered review process, including human moderation, that ads undergo before publication. The company committed to using the findings to improve future detection and reiterated its ongoing efforts to strengthen policy enforcement.

Facebook demonstrated a marked improvement over its previous performance, rejecting seven of the eight submitted ads. However, the one approved ad falsely claimed that a driver’s license is required to vote, underscoring the vulnerabilities that remain in its systems. While Facebook’s performance in the US has improved, its struggles to curb disinformation in other elections, such as Brazil’s, raise concerns about the consistency of its global enforcement. The lack of response from Meta, Facebook’s parent company, to requests for comment further fuels these concerns.

YouTube, while rejecting half of the submitted ads, presented a more complicated picture. The platform paused the testing account and requested further verification, leaving open the question of whether the remaining ads would have been approved had verification been provided. Notably, only one rejection was explicitly based on "unreliable claims," suggesting a potential gap in identifying disinformation unless it directly concerns election processes. As with Facebook, YouTube’s inconsistency in tackling disinformation across countries, particularly its failure to detect disinformation in ads related to the Indian elections, highlights the need for more robust global enforcement. A Google spokesperson pointed to the company’s multi-layered review process and its commitment to continuous improvement in policy enforcement.

The investigation’s findings underscore the urgent need for increased content moderation capabilities and robust integrity systems across all platforms, not just in the US, but globally. Properly resourcing content moderation efforts, including fair wages and psychological support for moderators, is crucial. Platforms must also proactively assess and mitigate risks to human rights and societal harms, publish transparency reports detailing their election integrity initiatives, and allow independent audits for accountability.

Specific recommendations for Meta include strengthening ad account verification and urgently bolstering its content moderation systems. For TikTok, the immediate priority is improving its systems for identifying political content and enforcing existing rules, alongside a significant upgrade to its disinformation detection capabilities. Ultimately, the responsibility rests with these platforms to protect democratic processes worldwide by implementing comprehensive and consistent measures against election disinformation. The 2024 US presidential election stands as a critical test of their commitment to that responsibility.
