Google Under Scrutiny for Alleged Role in Monetizing Misinformation That Fueled Southport Riots
A parliamentary committee is investigating Google’s potential role in monetizing a website that spread misinformation about the identity of the Southport attacker, falsehoods that helped fuel widespread riots last summer. The Science, Innovation and Technology Committee revealed that an unpublished report from digital advertising watchdog Check My Ads alleges Google’s advertising network profited from a site publishing false information about the attacker. This misinformation, including false claims that the attacker was an asylum seeker, contributed to the violence that erupted across England following the murders of three girls in Southport.
The committee’s chair, Chi Onwurah MP, confronted Google’s managing director for trust and safety in Europe, Amanda Storey, with these allegations during an evidence session on misinformation. Storey responded that if true, such monetization would violate Google’s policies and promised a thorough investigation. She acknowledged the gravity of monetizing low-quality information, especially in connection with such a horrific real-world event. Storey emphasized the challenges of combating the rapid spread of misinformation in fast-moving situations, particularly the "echo" effect across various online platforms.
While expressing sympathy for the victims and their families, Storey asserted that Google’s position as a search engine, as distinct from a social media platform, made it less susceptible to the spread of misinformation. She suggested that Google’s existing policies and approaches limited its involvement in the Southport incident. However, she conceded that the Online Safety Act’s incoming codes of practice, backed by hefty penalties for non-compliance, would likely have made a difference in limiting the spread of misinformation during the riots.
Committee member Emily Darlington MP expressed concern over Google’s seeming attempt to distance itself from the misinformation surrounding the Southport attack. She directly challenged Storey on whether Google had reflected on its role in monetizing content that potentially fueled the riots. Storey confirmed that Google conducts thorough post-mortem analyses of real-world harm events, including "root cause and corrective action assessments," to identify areas for improvement in its policies and enforcement. She pledged to share more details of this process with the committee.
The allegations against Google raise critical questions about the responsibility of tech companies in preventing the spread of misinformation, particularly during volatile events. The case underscores the challenges of managing the complex online ecosystem where misinformation can rapidly proliferate and incite real-world violence. The committee’s investigation and the impending Online Safety Act signal a growing determination to hold tech giants accountable for the content they host and monetize.
The incident also highlights the urgent need for improved mechanisms to identify and counter misinformation in real time, especially during rapidly evolving crises. It calls for a collaborative approach involving tech companies, regulators, and civil society to develop effective strategies for protecting online users from harmful content and mitigating its real-world consequences. The outcome of this investigation and the implementation of the Online Safety Act will have significant implications for the future of online content moderation and the responsibility of tech platforms in combating misinformation.