Mitigating Disinformation on TikTok: Corporate Policies for Youth Protection

By Press Room | January 12, 2025

The Rise of TikTok as a News Source and the Battle Against Disinformation

Disinformation, the deliberate spread of false or misleading information, has emerged as a significant threat to modern society and democratic processes. A 2022 study revealed that the vast majority of Spaniards are concerned about the issue, with a substantial number admitting to having fallen prey to false information online. This concern is amplified by the growing influence of social networks, particularly among younger demographics, as primary sources of news and information. Platforms like TikTok, immensely popular with Generation Z, have become key battlegrounds in the fight against the proliferation of disinformation. This article explores the corporate policies implemented by TikTok to combat disinformation, focusing on the European legal framework, youth content consumption patterns, and the platform’s specific strategies for content moderation and collaboration with fact-checkers.

TikTok’s position as a leading social network, particularly its prominence among Generation Z (16-24 years old), necessitates a robust approach to combating disinformation. Research indicates that a significant percentage of young people rely on social networks as their primary news source, a shift away from traditional media consumption. This trend has placed TikTok in a unique position of responsibility, given its algorithm-driven content recommendation system, the "For You Feed" (FYF), which can personalize and inadvertently amplify disinformation. The platform’s addictive nature, characterized by "rabbit hole" effects where users consume vast quantities of similar content, further exacerbates the risk of exposure to false information.
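
To illustrate the feedback loop at issue, the short Python sketch below simulates an engagement-weighted recommender: each time a simulated user engages with a topic, that topic's ranking weight grows, and the feed gradually narrows around it. The topics, weights, and boost factor are invented for this illustration; this is not TikTok's actual recommendation system.

    # Toy illustration only: a minimal engagement-weighted recommender showing how
    # a feedback loop can narrow a feed into a "rabbit hole". Topics, weights, and
    # the boost factor are invented for this sketch; this is not TikTok's system.
    import random
    from collections import Counter

    TOPICS = ["news", "comedy", "sports", "conspiracy", "music"]

    def recommend(weights):
        """Pick a topic with probability proportional to its current ranking weight."""
        topics = list(weights)
        return random.choices(topics, weights=[weights[t] for t in topics], k=1)[0]

    def simulate(steps=1000, boost=1.05):
        weights = {topic: 1.0 for topic in TOPICS}
        seen = Counter()
        preferred = "conspiracy"  # the simulated user only engages with this topic
        for _ in range(steps):
            topic = recommend(weights)
            seen[topic] += 1
            if topic == preferred:       # engagement signal feeds back into ranking
                weights[topic] *= boost  # each interaction makes similar content more likely
        return seen

    print(simulate())  # the preferred topic ends up dominating what the user sees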

The European Union’s Digital Services Act (DSA), enacted in 2022, has established a comprehensive legal framework to address the challenges posed by disinformation online. This legislation places significant obligations on Very Large Online Platforms (VLOPs), including TikTok, to implement measures that limit the spread of illegal content, protect minors, and provide users with greater transparency regarding content moderation practices. The DSA’s impact on TikTok is substantial, requiring the platform to adapt its content moderation processes, increase transparency, and address potential risks associated with its design and algorithms, including addictive behaviors and exposure to harmful content.

TikTok has implemented various corporate policies to comply with the DSA and address the issue of disinformation. These policies can be categorized into three main areas: institutional collaboration, content moderation transparency, and fact-checking partnerships. Regarding institutional collaboration, TikTok has signed onto the EU’s Code of Practice on Disinformation, committing to a series of measures aimed at combating the spread of false information. However, the platform has also opted out of several commitments, notably those related to empowering users with tools to assess content provenance and trustworthiness. This decision raises concerns about the platform’s commitment to user empowerment in the fight against disinformation.

Transparency in content moderation practices is a key requirement of the DSA. TikTok is obligated to provide detailed reports on its content moderation activities, including the reasons for restricting or removing content. Data from the DSA Transparency Database reveal that TikTok has been highly active in content moderation, submitting a substantial number of Statements of Reasons (SoRs) related to content removal or restriction. The platform employs a combination of automated moderation technology and human review teams to identify and address violations of its Community Guidelines. However, the use of practices like "shadow banning," where content visibility is reduced without user notification, raises concerns about transparency and potential censorship.
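
The kind of two-stage workflow described above can be pictured with the following sketch: an automated screen handles clear-cut cases, borderline items are escalated to a human reviewer, and every decision produces a DSA-style Statement of Reasons record. The thresholds, labels, and field names are assumptions made for illustration, not TikTok's internal implementation.

    # Hypothetical sketch of a two-stage moderation flow: an automated screen for
    # clear-cut cases, escalation to a human reviewer for borderline ones, and a
    # DSA-style Statement of Reasons record for every decision. Thresholds, labels,
    # and field names are assumptions for illustration, not TikTok's implementation.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass
    class StatementOfReasons:
        content_id: str
        decision: str    # e.g. "removed", "visibility_restricted", "no_action"
        ground: str      # the policy basis cited for the decision
        automated: bool  # whether the decision was made without human review
        timestamp: str

    def automated_score(text: str) -> float:
        """Placeholder scorer; a real system would use trained classifiers."""
        text = text.lower()
        if "miracle cure" in text:
            return 0.95   # treated here as a clear violation
        if "rigged" in text:
            return 0.60   # borderline: needs human judgement
        return 0.10

    def moderate(content_id: str, text: str, human_review=None) -> StatementOfReasons:
        score = automated_score(text)
        if score >= 0.9:                                 # clear violation: act automatically
            decision, automated = "removed", True
        elif score >= 0.5 and human_review is not None:  # borderline: escalate to a person
            decision, automated = human_review(text), False
        else:
            decision, automated = "no_action", True
        return StatementOfReasons(
            content_id=content_id,
            decision=decision,
            ground="Community Guidelines: misinformation" if decision != "no_action" else "n/a",
            automated=automated,
            timestamp=datetime.now(timezone.utc).isoformat(),
        )

    # Example: a borderline post is routed to a reviewer who restricts its visibility.
    print(moderate("video-123", "Proof the vote was rigged!",
                   human_review=lambda text: "visibility_restricted"))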

Collaboration with fact-checking organizations is another crucial aspect of TikTok’s strategy against disinformation. The platform partners with several independent fact-checkers to assess the accuracy of content flagged as potentially misleading. This process involves reviewing flagged content against a global database of misinformation and taking appropriate action, such as content removal or labeling. While this collaboration represents a positive step, the limited number of fact-checking partners and the reliance on user reporting for content flagging raise questions about the comprehensiveness of this approach.
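
The fact-checking loop can be sketched in a similar spirit: a flagged post is matched against a database of previously debunked claims and either removed, labeled, or routed to a fact-checking partner. The claims, matching logic, and actions below are simplified assumptions rather than TikTok's actual database or workflow.

    # Simplified sketch of the fact-checking loop described above: a user-flagged
    # post is matched against a database of previously debunked claims and either
    # removed, labeled, or routed to a fact-checking partner. Claims and actions
    # here are illustrative assumptions, not TikTok's actual database or workflow.
    DEBUNKED_CLAIMS = {
        "drinking bleach cures covid": "remove",
        "the moon landing was staged": "label",
    }

    def triage_flagged_post(flagged_text: str) -> str:
        text = flagged_text.lower()
        for claim, action in DEBUNKED_CLAIMS.items():
            if claim in text:
                return action             # a known false claim: act immediately
        return "send_to_fact_checker"     # unknown claim: route to a partner for review

    print(triage_flagged_post("They admit the moon landing was staged!"))  # -> "label"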

While the DSA and TikTok’s corporate policies represent significant steps towards combating disinformation, several challenges and concerns remain. The effectiveness of these policies in mitigating the spread of false information among young audiences, TikTok’s primary user base, requires further evaluation. The platform’s design and algorithms, particularly the FYF’s potential to create "rabbit hole" effects, continue to pose challenges in controlling the flow of disinformation. Additionally, the practice of "shadow banning" raises concerns about transparency and potential impacts on freedom of expression. Moreover, several governments and institutions have banned TikTok from official devices due to cybersecurity concerns, further highlighting the need for robust oversight and accountability.

Looking ahead, continued monitoring and evaluation of TikTok’s policies, coupled with stronger enforcement mechanisms, are essential to ensure their effectiveness in addressing the complex challenges of disinformation. Equipping young users with media literacy and critical thinking skills is crucial to reducing their susceptibility to false information. Collaboration between platforms, policymakers, educators, and civil society organizations is vital to create a safer and more informed online environment, particularly for vulnerable youth audiences. The ongoing investigations into TikTok’s practices under the DSA underscore the need for continuous scrutiny and adaptation to ensure that the platform fulfills its responsibilities in protecting users from the harmful effects of disinformation.
