DISA

Parliamentary Inquiry into Online Misinformation by X, TikTok, and Meta

By Press Room | February 27, 2025

Social Media Giants Face Parliamentary Scrutiny Over Handling of Misinformation During Southport Riots

The House of Commons Science, Innovation and Technology Committee (SITC) convened a hearing on February 25, 2025, to address the pervasive issue of online misinformation and the role of algorithms in amplifying harmful content. The committee’s focus centered on the Southport Riots of 2024, sparked by the tragic stabbing of three girls and fueled by false rumors circulating on social media platforms. Representatives from Meta, TikTok, and X (formerly Twitter) were called upon to defend their content moderation practices and explain their responses to the rapid spread of disinformation that contributed to the unrest. Committee Chair Chi Onwurah emphasized the intense public interest in the issue, highlighting concerns about misinformation being disseminated on an industrial scale.

The Southport Riots served as a stark example of the real-world consequences of online misinformation. Following the stabbings, baseless rumors falsely identifying the perpetrator as a Muslim asylum seeker quickly proliferated across social media. This misinformation ignited a wave of Islamophobic violence in numerous English towns and cities, targeting mosques, asylum seeker accommodations, immigration centers, and individuals perceived to be of color. The committee sought to understand how social media companies responded to this crisis and what measures they took to counter the spread of harmful content.

Meta, TikTok, and X detailed their efforts to remove content that violated their respective community guidelines. Chris Yiu, Meta’s Director of Public Policy for Northern Europe, reported the removal of approximately 24,000 posts for violating policies on violence and incitement, and an additional 2,700 posts related to dangerous organizations. Alistair Law, TikTok’s Director of Public Policy for the UK and Ireland, highlighted the removal of tens of thousands of posts containing violent comments. Wilfredo Fernández, X’s Senior Director for Government Affairs, emphasized the platform’s "community notes" feature, which aims to provide contextual information to users. However, all three representatives acknowledged the challenges of establishing factual accuracy in rapidly evolving situations.

Despite these efforts, committee members questioned the effectiveness of existing content moderation practices. MPs cited instances of blue-tick verified X accounts sharing the locations of immigrants and encouraging violence that seemingly escaped community notes moderation. Fernández conceded that the platform does not always make the correct call but maintained that it took action on tens of thousands of posts. Labour MP Emily Darlington described her personal experience of receiving violent threats on X, prompting Fernández to acknowledge the abhorrent nature of the comments while stopping short of guaranteeing their removal.

Meta’s decision to replace third-party fact-checking with a community notes approach drew criticism from the committee, with MPs arguing that this could facilitate the spread of racist misinformation. Yiu defended the move, stating that it aimed to allow for a wider range of challenging conversations while acknowledging the need for balance. Onwurah and Darlington countered that certain topics, such as denying the existence of trans people or denigrating immigrants, should not be considered open for debate. While Meta and TikTok representatives emphasized their high removal rates of violent content (upwards of 98%), questions remained about the adequacy of these measures.

The looming implementation of the Online Safety Act (OSA) was a key topic of discussion. The OSA places new obligations on tech companies to tackle disinformation and harmful content, including hate speech, incitement to violence, and certain forms of disinformation. Ofcom, the online harms regulator, emphasized the importance of platforms’ systems and processes in addressing illegal content, noting that it would have a range of enforcement powers, including significant financial penalties.

At the time of the Southport riots, however, the OSA’s criminal offences related to threatening communications and non-compliance with information notices were already in force, raising questions about their applicability to the incitement of violence through social media. While some legal experts argued that existing Public Order Act offences were sufficient, others questioned the practical application of these laws in the fast-paced online environment. The social media companies maintained that they already had robust processes and systems in place to address misinformation crises, regardless of the OSA.

The committee’s inquiry nonetheless underscored the ongoing debate over the efficacy of these measures and the need for continuous improvement in tackling the complex challenge of online misinformation. The Southport Riots served as a tragic reminder of the potential for online falsehoods to incite real-world violence, placing increasing pressure on social media platforms to refine their content moderation practices and prevent the spread of harmful content. The effectiveness of the OSA in addressing these challenges remains to be seen.

© 2025 DISA. All Rights Reserved.