Bot Accounts Spread Climate Disinformation and Fuel Political Divisions
A recent investigation has uncovered a network of bot-like accounts on X (formerly Twitter) actively spreading climate disinformation and exacerbating political polarization. The accounts were identified by a methodology that flags suspicious online behavior: high posting frequency, little original content, and rapid account creation in response to events. They were found to be amplifying climate denial narratives and conspiracy theories, and a significant number frequently used hashtags such as #ClimateScam and #ClimateCult, often alongside other conspiracy-related hashtags. This activity points to a coordinated effort to undermine climate action and sow distrust in the scientific consensus.
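The behavioral signals described above can be sketched as a simple scoring heuristic. The thresholds and function below are illustrative assumptions for exposition only, not the investigation's actual criteria or code:

```python
from datetime import datetime

# Illustrative thresholds -- assumptions, NOT the investigation's actual criteria.
MAX_POSTS_PER_DAY = 60    # unusually high posting frequency
MIN_ORIGINAL_RATIO = 0.1  # share of posts that are original rather than reposts
EVENT_WINDOW_DAYS = 7     # account created within days of a triggering event

def bot_likeness_flags(posts_per_day, original_ratio, created_at, event_date):
    """Return the list of behavioral heuristics an account trips."""
    flags = []
    if posts_per_day > MAX_POSTS_PER_DAY:
        flags.append("high posting frequency")
    if original_ratio < MIN_ORIGINAL_RATIO:
        flags.append("low original content")
    if abs((created_at - event_date).days) <= EVENT_WINDOW_DAYS:
        flags.append("account created around event")
    return flags

# Hypothetical account: 200 posts/day, 2% original content,
# created two days after a major climate-related event.
flags = bot_likeness_flags(
    posts_per_day=200,
    original_ratio=0.02,
    created_at=datetime(2023, 12, 2),
    event_date=datetime(2023, 11, 30),
)
print(flags)  # all three heuristics fire
```

In practice, researchers combine many more signals (reply patterns, content similarity across accounts, follower graphs) and treat each flag as evidence rather than proof; no single threshold reliably separates bots from enthusiastic humans.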
Conspiracy Theories and Attacks on Climate Action
Beyond simply denying the reality of climate change, these bot-like accounts propagated alarming conspiracy theories, portraying climate action proponents as a dangerous cult intent on harming humanity and the natural world. These accounts disseminated claims that renewable energy infrastructure like solar panels and wind farms cause significant environmental damage. They also amplified more extreme narratives suggesting that climate action is a deliberate plot to reduce the quality of life and even that "geoengineering" is being used to intentionally sicken populations. This type of disinformation aims to instill fear and mistrust towards climate initiatives and those who support them.
Climate Action Portrayed as a Power Grab by Elites
Another prominent theme amplified by these accounts was the idea that climate policies are a deceptive tool for elites and governments to amass wealth and control populations. They framed environmental regulations as a means of surveillance and restriction of freedoms, referencing “blade runners” – individuals vandalizing traffic cameras associated with low-emission zones – as symbols of resistance. This narrative seeks to cast climate action as a sinister plot against individual liberty and economic well-being.
Climate Discourse Entangled with Political Identity
While climate change was not the sole focus of these bot-like accounts, the topic was consistently present and often intertwined with other political issues. The accounts frequently opposed climate action while also taking stances against LGBTQIA+ rights, vaccination, and involvement in conflicts such as the war in Ukraine. Conversely, support for climate action, social justice, and international cooperation was portrayed as the hallmark of a nefarious "globalist" or "far-left fascist" agenda. This deliberate linking of climate change to other divisive issues further fuels political polarization and fosters an "us vs. them" mentality.
Interaction with the Wider Information Environment
These bot-like accounts did not operate in isolation. They frequently shared links to external websites known for spreading misinformation and conspiracy theories, and they interacted with other X users, some with substantial follower counts, further amplifying their messages within the platform's ecosystem. Evidence also points to connections between some of these accounts and other profiles that boost their content, suggesting a coordinated network designed to maximize the reach of the disinformation campaign. This interconnectedness highlights the potential for bot-like activity to contaminate the broader information landscape and influence public perception.
Call for Platform Accountability and Transparency
The proliferation of bot-like accounts spreading climate disinformation poses a significant threat to informed public discourse and effective climate action. Social media platforms like X have a responsibility to address this issue and enforce their policies against platform manipulation. While X claims to have investigated the identified accounts and found no evidence of manipulation, it has not provided sufficient transparency about its methods or findings. This lack of accountability underscores the urgent need for greater platform transparency and robust regulation to combat the spread of disinformation and protect democratic processes from manipulation. Because limited data access hampers the ability of researchers and the public to scrutinize platform activity, legislation mandating greater transparency is essential.