TikTok Fails to Detect Disinformation Ads Targeting EU Elections, Raising Concerns About Platform’s Ability to Safeguard Democratic Processes
A new investigation by Global Witness has revealed alarming vulnerabilities in TikTok’s advertising review process, casting doubt on the platform’s capacity to prevent the spread of disinformation ahead of the upcoming European Parliament elections. The investigation tested the ability of major social media platforms to detect and reject ads containing false and misleading information about the elections: TikTok approved every disinformation ad submitted, while YouTube and X (formerly Twitter) identified and blocked most or all of the same content. This disparity underscores the urgent need for stricter enforcement of the EU’s Digital Services Act (DSA) and for social media companies to prioritize election integrity across all regions and languages.
The investigation focused on Ireland, where many tech platforms have their European headquarters or regional hubs. Global Witness created 16 short video ads containing disinformation designed to suppress voter turnout and undermine confidence in the electoral process. The ads featured fabricated claims about altered ballots, changed ID requirements, and polling station closures due to infectious disease outbreaks, claims that directly contravene EU regulations and TikTok’s own policies, which explicitly prohibit political advertising and election disinformation.
X blocked all 16 ads and suspended the account used for the test, and YouTube rejected all but two; TikTok, by contrast, approved every one of the disinformation ads for publication, a 100% approval rate. Global Witness withdrew the ads after they passed review but before they could go live, ensuring that no users were exposed to the disinformation.
To further assess TikTok’s post-publication review process, Global Witness submitted a seemingly innocuous political ad stating “It’s an election year!” This ad, which also violated TikTok’s ban on political advertising, was approved and garnered over 12,000 impressions in less than an hour. The finding points to significant gaps in TikTok’s real-time monitoring: policy-violating content that slips through initial review can reach a substantial audience before any post-publication check catches it.
TikTok’s failure to detect these disinformation ads is particularly concerning given the platform’s growing popularity, especially among young voters. With many young people relying on social media for political information, TikTok’s vulnerability to manipulation poses a serious threat to the integrity of the democratic process. The investigation makes clear that TikTok must address the shortcomings in its content moderation systems and proactively enforce its own policies against election disinformation.
The implications of this investigation extend beyond TikTok and underscore the broader challenges of regulating online political advertising and combating disinformation. While YouTube and X identified and blocked the disinformation ads in this instance, their track record in other regions and past elections has been inconsistent. Content moderation must therefore be enforced consistently and equitably across all platforms and geographies. The upcoming European Parliament elections serve as a critical test of the efficacy of the DSA and of social media companies’ commitment to safeguarding democratic processes against online manipulation.
The EU’s Digital Services Act and the Responsibility of Big Tech
The timing of this investigation coincides with the full implementation of the DSA, which mandates that large social media companies take proactive measures to address systemic risks, including election interference. The DSA empowers EU regulators to impose fines of up to 6% of a company’s global annual turnover and, as a last resort, to temporarily ban platforms that repeatedly fail to comply. Global Witness has filed a complaint with the EU regulator, providing evidence of TikTok’s failure to detect the disinformation ads and urging enforcement action.
In its response to the investigation, TikTok acknowledged that the ads violated its advertising policies and attributed their approval to human error by a single moderator, who the company says has since been retrained; TikTok also says it has implemented new procedures to prevent future errors. The scale of the failure, however, with all 16 disinformation ads approved, points to systemic issues rather than an individual moderator’s mistake. The rapid approval and spread of the innocuous “It’s an election year!” ad further underscores the need for more robust real-time monitoring and enforcement mechanisms.
The findings also point to a broader need for transparency and accountability from social media platforms. Platforms should publicly disclose their content moderation policies, provide access to ad repositories for all countries in which they operate, and commission independent evaluations of the impact of their policies on democracy and human rights. Crucially, these measures must be applied equitably across all elections globally, regardless of region or language, so that platforms do not prioritize Western elections over those in other parts of the world.
Recommendations for Social Media Companies and Policymakers
To effectively address the threat of online disinformation and safeguard democratic processes, social media companies should act on the following recommendations:
- Increased Resourcing and Transparency: Dedicate adequate resources to content moderation efforts and be transparent about the processes and technologies used.
- Robust Policy Enforcement: Implement and consistently enforce strong policies against election-related disinformation for both organic content and paid advertisements.
- Public Ad Repositories: Establish publicly accessible repositories for all political ads across all countries.
- Independent Evaluations: Conduct regular and public evaluations of the impact of content moderation policies on democracy and human rights.
- Global Equity: Ensure that content moderation policies and enforcement efforts are applied consistently across all elections worldwide, regardless of region or language.
The findings of this investigation serve as a wake-up call for social media companies and policymakers alike. As online platforms become increasingly influential in shaping public discourse and political outcomes, it is essential to strengthen regulatory frameworks and ensure that platforms are held accountable for their role in safeguarding democratic processes. The upcoming European Parliament elections represent a crucial opportunity to demonstrate the effectiveness of the DSA and to hold social media companies to their commitments to protect election integrity.