EU Formalizes Disinformation Code, Holding Online Platforms Accountable Under DSA

The European Union is taking a decisive step toward curbing the spread of disinformation online by formalizing voluntary commitments made by major online platforms under the Digital Services Act (DSA). This move, announced by the European Commission, integrates the 2022 Code of Practice on Disinformation into the DSA framework, establishing a concrete benchmark for evaluating compliance and holding platforms accountable for their role in combating online manipulation. By July, all major online platforms, with the notable exception of Elon Musk’s X (formerly Twitter), will be subject to these formalized commitments, strengthening the EU’s efforts to ensure a safer and more transparent digital environment.

The code, signed by 42 companies, including tech giants such as Google, Meta, Microsoft, and TikTok, outlines a comprehensive set of measures aimed at tackling disinformation. Key aspects include enhanced transparency in political advertising, collaboration during elections, and support for independent fact-checking. Formalizing these previously voluntary commitments empowers the Commission to scrutinize platform practices more effectively and to verify adherence to the DSA’s stringent requirements. While signing the code does not grant platforms immunity from investigation, it provides a framework for assessing their efforts and identifying areas where improvement is needed. The Commission emphasizes that this is not a "tick the box" exercise, but a commitment to meaningful engagement in combating disinformation.

The integration of the code comes as the Commission ramps up its enforcement of the DSA, whose obligations for the largest platforms began applying in August 2023. Since then, the Commission has opened investigations into platforms including X, TikTok, and Meta’s Facebook and Instagram, examining their content moderation practices, advertising transparency, and compliance with the DSA’s rules. These investigations underscore the Commission’s commitment to actively monitoring platform behavior and holding companies responsible for upholding the DSA’s principles. The formalized disinformation code adds a further layer of scrutiny, enabling the Commission to evaluate platforms’ efforts against a clear set of standards.

The notable absence of X from the list of participating platforms underscores the ongoing tension between the company and EU regulators. X’s withdrawal from the code, which followed Elon Musk’s 2022 acquisition of the company, has raised concerns about the platform’s commitment to combating disinformation. While Meta remains committed to the code despite internal policy shifts regarding fact-checking in the US, the Commission acknowledges that it cannot compel participation. This highlights the challenge of regulating a rapidly evolving digital landscape and the need for robust mechanisms to ensure platform accountability, even in the face of voluntary withdrawals.

The Commission’s scrutiny of X extends beyond its withdrawal from the disinformation code. Throughout 2023 and 2024, the Commission has conducted extensive investigations into X’s operations, focusing on content moderation practices, advertising transparency, and potential algorithmic manipulation for political purposes. Several breaches of the DSA have been identified, including failure to provide essential data for research, non-compliance with advertising transparency rules, and misuse of the verification system. These violations have triggered a deeper investigation into X’s role in potentially amplifying far-right political messaging, particularly in the context of the German parliamentary elections in 2025.

The Commission’s concerns intensified following a controversial livestream featuring Elon Musk and Alice Weidel, leader of Germany’s far-right AfD party, which raised suspicions of algorithmic manipulation to favor specific political agendas. Consequently, X has been ordered to provide detailed information about recent algorithmic changes and to retain relevant data concerning its content-serving practices. The outcome of this investigation could lead to significant fines and stricter oversight of X’s operations within the EU, underscoring the potential consequences of non-compliance with the DSA and the Commission’s resolve to enforce its regulations. The formalization of the disinformation code, coupled with the ongoing investigations, demonstrates the EU’s multifaceted approach to tackling online disinformation and holding platforms accountable for their role in shaping the digital information landscape.
