Parliamentary Inquiry to Summon Elon Musk, Meta, and TikTok Executives Over UK Riots and Disinformation
A parliamentary inquiry into the UK riots and the proliferation of harmful AI-generated content will summon Elon Musk, owner of X (formerly Twitter), to testify alongside senior executives from Meta and TikTok. The Commons science, innovation and technology select committee aims to scrutinize the role of social media platforms in spreading disinformation and the impact of rapidly advancing technologies on UK online safety laws. The hearings, scheduled for the new year, will examine the consequences of generative AI, particularly its use in disseminating incendiary images that fueled Islamophobic riots following the killing of three schoolgirls in Southport.
The committee’s investigation will focus on the business models of Silicon Valley companies, which critics argue incentivize the spread of misleading and harmful content. Labour MP Chi Onwurah, chair of the select committee, said she intends to question Musk about the apparent contradiction between his professed commitment to freedom of expression and the proliferation of disinformation on his platform. The invitation follows Musk’s exclusion from the UK government’s international investment summit in October, a snub he publicly criticized.
Whether Musk will appear before the committee is uncertain, given his increasingly strained relationship with the UK government. The tech billionaire has been sharply critical of the Labour government, invoking comparisons to Stalinism in response to proposed inheritance tax changes, and during the riots that followed the Southport killings he predicted that civil war in the UK was “inevitable”. His expected role in the Trump White House further complicates his availability and willingness to cooperate with the inquiry.
The inquiry coincides with a period of upheaval in the social media landscape, marked by user migration from X to alternative platforms such as Bluesky. Many users cite concerns about misinformation, the return of previously banned figures like Tommy Robinson and Andrew Tate, and X’s updated terms of service, which allow the platform to use user data for AI model training. Keir Starmer has indicated he has no plans to join Bluesky, with the Prime Minister emphasizing the government’s need to communicate with the broadest possible audience across a range of platforms.
Tensions between Musk and the UK government were further inflamed after his exclusion from the investment summit, when he publicly condemned the UK’s handling of convicted offenders and criticized the imprisonment of individuals over social media posts. One such individual, Lucy Connolly, was jailed for a post on X found to incite racial hatred, even though the platform itself did not judge the post to have broken its rules. The case underscores the complexity of regulating online content and the challenge platforms face in balancing freedom of expression against the prevention of harm.
The parliamentary inquiry will seek to understand the relationship between social media algorithms, generative AI, and the dissemination of false or harmful content. It will also examine the use of AI in search engines, following recent instances in which Google’s AI produced racist and inaccurate results. The committee’s investigation aims to inform the strengthening of UK online safety regulation, particularly ahead of the forthcoming rules under the Online Safety Act. Those rules will place greater responsibility on social media companies to prevent the spread of illegal material and to mitigate safety risks, including content that incites violence or hatred or spreads harmful misinformation. The inquiry’s findings will be central to shaping future policy and making the online environment safer and more accountable.