
EU Commission Cites Big Tech’s Insufficient Disinformation Efforts

By Press Room | June 26, 2025

EU Disinformation Code: Tech Giants Fall Short, Transparency and Data Access Remain Key Challenges

A new report from the European Digital Media Observatory (EDMO) reveals a significant gap between the commitments major tech platforms such as Google, Meta, Microsoft, and TikTok have made to combat disinformation and their actual implementation. The report, covering the first half of 2024, assesses the platforms’ adherence to the eight core commitments outlined in the EU’s Code of Practice on Disinformation. This voluntary code, established in 2018, will be integrated into the legally binding Digital Services Act (DSA) on July 1st, raising the stakes for these companies. EDMO’s findings paint a concerning picture, highlighting consistent deficiencies in transparency, independent oversight, and measurable outcomes across all assessed commitments. The report warns that without substantial improvements, the code risks becoming a performative exercise rather than a genuine effort to curb the spread of disinformation.

The core commitments of the code encompass a range of actions, including preventing advertisements from appearing alongside disinformation, effectively labeling misleading or fake information, and providing researchers with access to platform data for independent analysis. The EDMO assessment, based on the platforms’ transparency reports, expert surveys, and internal research, finds a widespread lack of comprehensive, detailed information on how these platforms are mitigating disinformation. The report employs a rating scale from "very poor" to "excellent" to evaluate the platforms’ transparency reports and concludes that overall efforts to combat disinformation “remain very limited, lacking consistency and meaningful engagement.” While Meta and Google have launched various initiatives, these are often criticized for lacking depth and impact, appearing more symbolic than substantive.

A key issue highlighted in the report is the limited accessibility and transparency of tools designed to combat disinformation. Features like Google’s and Meta’s political ad and fact-checking labels, along with Microsoft’s “Content Integrity Tools,” remain difficult to access and evaluate. The report criticizes the lack of data on user engagement with these tools, particularly country-specific data. The absence of user engagement figures, reported outcomes, and information on the actual scale of these efforts raises concerns about their effectiveness. Similarly, media literacy initiatives, such as Meta’s “We Think Digital,” Microsoft’s partnership with NewsGuard, and Google’s pre-bunking efforts, are criticized for being high-level and lacking measurable data. This opacity casts doubt on the genuine impact of these initiatives, suggesting they may amount to little more than "declarative gestures."

Furthermore, the report identifies a lack of transparency regarding the performance of fact-checking mechanisms. Meta, Google, and TikTok employ fact-checking panels, user prompts, notifications, and labels to address potentially misleading information. However, they fail to provide concrete data on the real-world performance of these tools. While Google reports "large-scale reach figures" for its fact-checking panels, it does not offer meaningful data on how user behavior changes after exposure to these fact-checks. This lack of data makes it difficult to assess the effectiveness of these interventions and their impact on the spread of disinformation. The report underscores the need for more granular data to understand how these tools are influencing user behavior and contributing to a more informed online environment.

The report also highlights significant shortcomings in the provision of data to researchers studying disinformation. While TikTok received a passing grade in this area, researchers still reported difficulties accessing data through its Research API due to an opaque application process. The other platforms offer access to "certain datasets" through researcher programs, but access remains highly restricted, hindering independent analysis and scrutiny. This limits researchers’ ability to understand the complex dynamics of disinformation on these platforms and to develop effective countermeasures. The report emphasizes the importance of greater data transparency and accessibility to enable independent research and foster a more informed understanding of the disinformation landscape.

In conclusion, the EDMO report paints a critical picture of the current efforts by major tech platforms to combat disinformation. The findings reveal significant gaps between commitments and implementation, particularly in areas of transparency, data access, and measurable outcomes. As the EU’s Digital Services Act comes into effect, these companies face increasing pressure to demonstrate genuine commitment to combating disinformation. The report calls for greater transparency, more robust data sharing with researchers, and a shift from symbolic initiatives to concrete, measurable actions that demonstrably reduce the spread of disinformation online. The future of online information integrity hinges on the platforms’ willingness to embrace transparency, collaboration, and accountability in their efforts to combat this critical challenge.

© 2025 DISA. All Rights Reserved.
