EU Bolsters Fight Against Disinformation: Online Platforms Face New Era of Accountability Under Strengthened Digital Services Act
Brussels – In a significant move to combat the spread of disinformation online, the European Commission and the European Board for Digital Services have endorsed the integration of the Code of Conduct on Disinformation into the framework of the EU's Digital Services Act (DSA). The decision, announced on February 13, 2025, marks a critical shift from voluntary self-regulation to commitments that are auditable and enforceable under the DSA for platforms operating within the EU. From July 1, 2025, these platforms will face heightened scrutiny of, and accountability for, their content moderation practices and transparency efforts.
The integration of the Code of Conduct, which originated as a voluntary Code of Practice in 2018 and was substantially strengthened in 2022, represents a decisive step in the EU's ongoing effort to address the pervasive challenge of online disinformation. Because adherence was previously voluntary, the Code had limited power to curb the spread of false and misleading information. By anchoring its provisions in the DSA, the Commission aims to create a robust, enforceable framework that compels online platforms to take proactive measures against disinformation: regulators will be able to audit compliance and impose penalties for violations, ensuring greater accountability.
Under the strengthened framework, platforms will be expected to adopt stricter content moderation practices and more effective mechanisms for identifying and limiting disinformation. These measures may include enhanced fact-checking procedures, improved systems for users to flag potentially misleading content, and greater transparency about the algorithms used to curate and disseminate information. The new rules also emphasize providing users with clear, accessible information about the sources and credibility of the content they encounter online.
The integration of the Code of Conduct into the DSA is expected to have far-reaching implications for the online ecosystem, prompting significant changes in how platforms operate and moderate content. Platforms will need to invest in robust content moderation systems and develop clear compliance policies, which may mean hiring additional staff, deploying new technological solutions, and partnering with fact-checking organizations. Failure to comply with the DSA can result in fines of up to 6% of a platform's global annual turnover, among other penalties.
This move by the EU is part of a broader global trend towards greater regulation of online platforms, reflecting growing concerns about the impact of disinformation on democratic processes, public health, and societal cohesion. Other countries and international organizations are exploring similar measures to address false and misleading information online, with a focus on increasing transparency and accountability, and the EU's strengthened framework is expected to serve as a model for other jurisdictions.
The enhanced framework promises a more robust defense against disinformation within the EU, but its success will depend on effective implementation and ongoing evaluation. The Commission and the European Board for Digital Services will play a crucial role in monitoring compliance, enforcing the new rules, and adapting the framework as disinformation tactics and technologies evolve. Cooperation from online platforms, civil society organizations, and other stakeholders will also be essential to fostering a healthier, more trustworthy online environment.