An Assessment of the Online Safety Act’s Suitability for its Intended Purpose

By Press Room · February 16, 2025

The UK’s Online Safety Act: A Critical Examination in the Wake of Recent Riots

The recent riots that erupted across the UK have cast a harsh spotlight on the role of social media platforms in amplifying hate speech, disseminating misinformation, and even coordinating public disorder. The timing of these events coincides with the enactment of the UK’s Online Safety Act 2023, legislation heralded as a landmark achievement in making the online world safer. This juxtaposition naturally raises questions about the Act’s efficacy, its preparedness for handling crises like the recent riots, and whether it possesses the necessary tools to tackle the complex challenges posed by online harms.

While the Online Safety Act aims to establish a robust framework for online safety, its impact on the recent riots and similar future events remains limited due to its phased implementation. The Act’s core duties, requiring online service providers to assess and mitigate risks associated with both criminal content and content harmful to children, are not yet in force. These duties hinge on guidance and codes of practice from Ofcom, the designated regulator, which are still under development and not expected to be finalized until early 2025. This delay means the Act’s provisions could not have directly impacted the recent unrest, even if the nature of the content fell within its scope.

The Online Safety Act primarily focuses on systemic improvements in content moderation and risk management by online service providers. It does not empower Ofcom or the government to order the removal of specific content items. While hate speech, incitement to violence, and the organisation of riots would likely fall within the Act’s purview once operational, its effectiveness in a rapidly evolving crisis remains uncertain. The Act’s emphasis on systemic change rather than individual content removal makes it ill-suited to the immediate and dynamic nature of online activity during crises. This stands in contrast to the broadcasting regime, where Ofcom has the power to rule on the acceptability of specific programmes.

The Act’s ability to combat mis- and disinformation, a significant factor in fuelling the recent riots, is further complicated by the removal of provisions relating to content harmful to adults during the legislative process. As a result, misinformation is addressed only if it constitutes a criminal offence or is prohibited by a platform’s terms of service, which the Act does not mandate. This leaves a potential gap for misinformation that, while harmful, does not reach the threshold of criminality. In the case of the riots, the spread of false information about the attacker in Southport demonstrably exacerbated tensions. The Act’s focus on assessing content against specific thresholds, rather than on the amplification and spread of such content, is a further limitation.

Existing legal mechanisms offer some avenues for addressing harmful online content, but they too have limitations in crisis situations. The video-sharing platform rules within the Communications Act 2003 mandate safeguards against illegal content and hate speech but, similarly to the Online Safety Act, do not empower direct content removal by Ofcom. These rules primarily apply to UK-based services, excluding many major international platforms. The video-on-demand rules, while potentially applicable to certain user-generated content deemed "on-demand programme services," primarily target UK-based providers and have seen limited enforcement by Ofcom.

The Online Safety Act does include a provision for "special circumstances" where the Secretary of State can direct Ofcom to take action in the face of a public safety threat. This mechanism, however, is primarily focused on transparency, requiring companies to explain their handling of the threat, rather than enabling direct content removal. It relies on the Secretary of State’s initiative and is reactive rather than proactive. Whether this provision is sufficient to address the rapid spread of harmful content in a crisis remains to be seen.

The recent riots and subsequent criticism of the Online Safety Act, including concerns raised by London Mayor Sadiq Khan, underscore the urgent need for a reevaluation of its scope and effectiveness. Addressing the current limitations, particularly its ability to act decisively in crisis situations and grapple with the complexities of mis- and disinformation, is crucial. The government’s stated intention to “look more broadly at social media” in light of the riots offers an opportunity to strengthen the Act and ensure it is equipped to address the evolving challenges of online safety in a rapidly changing digital landscape. This includes clarifying the threshold for intervention, streamlining processes for rapid response in crises, and potentially expanding the scope of the Act to address content harmful to adults. The evolving nature of online harms requires continuous adaptation of regulatory frameworks to ensure the online environment is both safe and conducive to free expression.
