
Parliamentary Committee Raises Concerns about Online Safety Act’s Efficacy in Combating Misinformation

By Press Room · July 11, 2025

Parliamentary Report Slams Online Safety Act’s Ineffectiveness Against Misinformation

A scathing report from the Science, Innovation, and Technology Committee has cast serious doubt on the effectiveness of the Online Safety Act in combating the spread of misinformation online. The committee argues that the Act, designed to curb harmful digital content, falls short of protecting UK citizens from a “core and pervasive online harm” – the rapid proliferation of false and misleading information. This critique comes amidst growing concerns about the impact of misinformation on public discourse, safety, and democratic processes. The report’s findings raise significant questions about the government’s ability to regulate online content effectively and hold social media platforms accountable for the spread of harmful narratives.

The committee’s concerns are particularly acute in light of the 2024 Southport riots, when misinformation spread rapidly on social media platforms, exacerbating tensions and fueling violence. The report argues that even if the Online Safety Act had been fully implemented at the time of the riots, it would have made little difference: Ofcom, the designated regulatory body, would have lacked the power to penalize platforms such as X (formerly Twitter) and Meta (Facebook’s parent company) for the misinformation associated with the unrest. This powerlessness stems from the Act’s narrow focus on illegal content, which leaves “legal but harmful” misinformation largely unchecked.

The committee’s investigation delved into the potential impact of the Act had it been in force during the riots. Representatives from major social media platforms, including Meta, TikTok, and X, were unable to articulate how the Act would have altered their response to the unfolding crisis. Ofcom, while acknowledging the Act’s limitations regarding “legal but harmful” content, suggested that platforms would have faced increased scrutiny regarding their risk assessments and crisis response mechanisms. However, the committee remained unconvinced, concluding that the Act, even in its fully implemented form, would have had minimal impact on curbing the spread of misinformation that contributed to the violence and hatred during the summer of 2024.

The report directly challenges the government’s assertions about the Act’s efficacy. Baroness Jones, the minister responsible for online safety, argued before the committee that the Act would have made a “real and material difference” by empowering Ofcom to demand the removal of illegal posts. However, the committee dismissed this argument, emphasizing the Act’s failure to address the broader issue of misinformation, which often falls within the realm of legality. This discrepancy between the government’s confidence in the Act and the committee’s skepticism highlights a critical gap in understanding the nature and impact of online harms.

The committee’s findings underscore the complex challenges of regulating online content in an era of rapid information dissemination. The inherent tension between freedom of expression and the need to protect individuals and society from harmful content presents a significant dilemma for policymakers. The report highlights the inadequacy of current regulatory frameworks in addressing the nuances of online harms, particularly in the context of misinformation, which can erode trust, fuel social division, and incite violence. The committee’s critique calls for a more robust and comprehensive approach to online safety that goes beyond simply removing illegal content and addresses the root causes of misinformation.

Expert opinions reinforce the committee’s concerns. Jake Moore, a global cybersecurity advisor at ESET, points to the inherent incentives for social media platforms to amplify engaging content, often regardless of its veracity or potential harm. The lack of transparency surrounding the algorithms that govern content distribution further complicates regulatory efforts, and Moore argues that greater transparency and independent audits of these algorithms are needed to enable effective intervention. This echoes the committee’s call for a more proactive and comprehensive approach to online safety. The Southport riots serve as a stark reminder of the real-world consequences of unchecked misinformation and of the urgent need for effective protections against online harms.
