TikTok’s Climate Misinformation Problem: A COP29 Case Study

The Conference of the Parties (COP), the annual United Nations climate change summit, serves as a critical platform for international collaboration and policy development to address the global climate crisis. Ensuring access to accurate and reliable information during this period is paramount. While TikTok, a prominent social media platform, explicitly prohibits climate misinformation in its community guidelines, a recent investigation reveals a concerning gap between policy and practice.

The investigation focused on comments posted on videos about COP29 published by major news organizations on TikTok. Twenty comments explicitly denying climate change or its human-made origins were identified and reported through TikTok’s in-house reporting tool. These comments appeared on videos from reputable news sources like the BBC, ITV, and Channel 4 News, collectively garnering over 3 million views. The comments’ visibility raised concerns about the potential for misinformation to contaminate reliable climate information, particularly as TikTok increasingly becomes a news source for many.

Despite TikTok’s stated policy against climate denial content, the platform’s moderation system failed to remove the vast majority of reported comments. Only one comment was initially removed for violating community guidelines. Another virtually identical comment remained untouched, while the rest received responses stating no violation had occurred. This apparent inaction raises serious questions about the adequacy of TikTok’s content moderation resources and investment, especially during high-profile events like COP29 when climate misinformation is likely to proliferate.

The risks associated with climate misinformation on TikTok are compounded by several factors. Research indicates that a significant portion of TikTok users engage with comments on videos, highlighting the potential influence of misinformation presented in this format. Furthermore, the prominence of the flagged comments – appearing on high-profile news organization videos often recommended in COP29 search results – underscores the direct targeting of credible news sources and the ease with which users might encounter false information. The timing of this investigation, coinciding with reports of planned layoffs within TikTok’s content moderation team, further fuels concerns about the platform’s commitment to combating misinformation.

While TikTok deserves credit for acknowledging the threat of climate misinformation in its policies and providing reporting mechanisms, this investigation demonstrates a clear discrepancy between policy and enforcement. Although TikTok displays a "#COP29 For Climate Action" banner directing users to trusted information on the UNFCCC website, the failure to effectively moderate climate denial comments undermines these efforts. Furthermore, a TikTok content moderator revealed an increasing reliance on automated and outsourced moderation, which they believe has contributed to the proliferation of misinformation and hate speech on the platform. This shift away from human moderation raises concerns about the platform’s capacity to effectively address nuanced and context-dependent issues like climate misinformation.

The investigation’s findings call for urgent action from TikTok. The platform must investigate why it failed to act on reported climate denial content and ensure its moderation systems effectively protect users against misinformation. This means adequately resourcing content moderation globally, including providing fair wages, unionization rights, and psychological support for moderators. TikTok must also robustly enforce its mis- and disinformation policies for both organic and paid content, prioritizing human-led moderation to handle the complexities of climate denial.

TikTok’s response to the investigation, while acknowledging the removal of the flagged comments, reflects a reactive approach rather than a proactive strategy to combat misinformation. The discovery of further climate denial comments on a video endorsed by TikTok For Good underlines how pervasive the problem is and the need for a more comprehensive approach to content moderation. The call for greater investment in human moderators echoes experts and union representatives who advocate a balance between automated and human oversight to ensure user safety and information integrity on the platform.
