EU Report Finds Tech Giants Failing to Combat Disinformation Effectively
Brussels – A new report released by the European Digital Media Observatory (EDMO) reveals that major tech platforms, including Google, Meta, Microsoft, and TikTok, are falling short in their efforts to combat disinformation under existing EU regulations. The report, which assesses the period from January to June 2024, scrutinizes how these companies have adhered to the eight core commitments outlined in the EU’s Code of Practice on Disinformation. This voluntary code, established in 2018, will be integrated into the Digital Services Act (DSA) effective July 1st, 2024. The EDMO’s findings highlight a significant disconnect between the platforms’ stated commitments and the tangible evidence of their implementation, raising concerns about the effectiveness of self-regulatory measures in addressing the spread of disinformation.
The EDMO evaluation found a "clear gap" between the platforms’ pledges under the Code of Practice and their actual implementation, raising concerns that these efforts are largely "performative." The report identifies persistent shortcomings across all eight commitments, particularly in transparency, independent oversight, and measurable outcomes. The assessment methodology involved reviewing the platforms’ transparency reports, conducting expert surveys, and drawing on EDMO’s internal research. Key commitments assessed include avoiding advertisements alongside disinformation, effectively labeling misleading or fake information, and providing researchers with access to platform data.
A recurring theme in the report is the lack of concrete data provided by the platforms to demonstrate the effectiveness of their initiatives. While Meta and Google have launched various programs to address disinformation, the EDMO criticizes these efforts as often "superficial or symbolic." The report points to the difficulty in accessing tools like political ad and fact-checking labels on Google and Meta, as well as Microsoft’s "Content Integrity Tools." This lack of accessibility is compounded by the absence of data on user interaction with these tools across different countries, rendering it impossible to assess their actual impact. The report states, “There are no user engagement figures, no reported outcomes, and no indication of the actual scale of these efforts.”
Similar concerns were raised regarding the platforms’ commitments to media literacy. Initiatives like Meta’s “We Think Digital,” Microsoft’s partnership with NewsGuard, and Google’s pre-bunking “More About This Page” were deemed “high-level” projects lacking quantifiable data to support their effectiveness. The report questions whether these measures are genuine attempts to combat disinformation or merely “declarative gestures.” The lack of transparency surrounding their implementation and impact raises doubts about their true purpose.
The EDMO report also criticizes the lack of data provided regarding the efficacy of fact-checking initiatives. While Meta, Google, and TikTok offer fact-checking panels, user prompts, and labels to identify potentially misleading information, they fail to provide concrete data on their performance in real-world scenarios. Google, despite reporting "large-scale reach figures" for its fact-checking panels, does not offer "meaningful data" on how user behavior changes after encountering these labels. This lack of transparency hinders the assessment of the true impact of fact-checking efforts in combating disinformation.
Regarding data access for researchers, only TikTok received a passing grade, albeit with caveats: researchers surveyed by the EDMO reported difficulties accessing data through TikTok’s Research API due to its "opaque application process." The other platforms, while offering access to "certain datasets" through researcher programs, maintain "highly restricted" access. This limited availability of platform data constrains independent research on disinformation and prevents a comprehensive understanding of how it spreads.
The overall conclusion of the EDMO report is that the efforts of these tech giants to fight disinformation “remain very limited, lacking consistency and meaningful engagement.” The lack of transparency, measurable outcomes, and accessible data raises concerns about the effectiveness of self-regulation. The report serves as a call to action for these platforms to significantly improve their efforts and provide concrete evidence of their commitment to combating disinformation. As the DSA takes effect, these platforms will face increased scrutiny and potential enforcement actions if they fail to meet their obligations to address the spread of disinformation.