AI-Generated Imagery and Emotionally Manipulative Language Used by Bots to Influence Online Discourse During Federal Election

By Press Room | April 21, 2025

Australia Grapples with Deluge of Fake Social Media Accounts During Election Campaign

The integrity of Australia’s recent election campaign has been called into question following revelations of widespread disinformation tactics on social media platforms. A report by disinformation detection company Cyabra has uncovered a significant presence of fake accounts on X (formerly Twitter) actively participating in political discussions and reaching millions of Australian voters. These accounts, estimated to comprise nearly one-fifth of the election-related profiles analyzed, employed artificial intelligence-generated images and emotionally manipulative language to disseminate biased narratives. One particularly active account, posting more than 500 times, reached an audience of approximately 726,000 users, demonstrating the scale and potential impact of these coordinated disinformation campaigns.

Cyabra’s "Disinformation Down Under" report details how the bot accounts targeted Prime Minister Anthony Albanese and Opposition Leader Peter Dutton with distinct strategies. One set of fake accounts sought to discredit Albanese and undermine his political standing by amplifying messages about the Labor government’s alleged incompetence, economic mismanagement, and progressive policies, pushing hashtags such as "Labor fail" and "Labor lies" and resorting to ridicule and name-calling that further fueled the polarized online environment. A second set targeted Dutton with pro-Labor narratives, portraying him as out of touch and inept and labeling the Coalition as broadly incompetent and corrupt, creating a false impression of widespread support for the incumbent government. This two-pronged approach maximized the spread of disinformation and contributed to the erosion of public trust in the political process.

The sophistication of these disinformation campaigns is evident in the bots’ strategic use of emotionally charged language, satire, and memes to maximize visibility and engagement. By exploiting the virality of such content, the fake accounts were able to effectively disseminate their fabricated narratives and manipulate the online conversation. The analysis, conducted throughout March, used AI technology to identify patterns of inauthentic activity, including posting frequency, language usage, and hashtags employed. This revealed coordinated efforts to push specific narratives designed to sway public opinion. The sheer volume of bot activity at times eclipsed genuine user engagement, allowing these fake accounts to dominate the narrative and drown out authentic voices.
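Cyabra has not published its detection pipeline, but the general idea described here, combining posting frequency with shared hashtag vocabularies to surface coordinated accounts, can be illustrated with a deliberately simplified sketch. All account names, thresholds, and data fields below are hypothetical and are not drawn from the report.

```python
# Purely illustrative sketch of coordination detection: flag accounts that
# post at an implausibly high rate AND share nearly the same hashtag
# vocabulary as another high-volume account. Thresholds are invented.

from collections import Counter
from dataclasses import dataclass


@dataclass
class Account:
    handle: str
    daily_posts: float   # average posts per day in the analysis window
    hashtags: Counter    # hashtag -> count across the account's posts


def hashtag_overlap(a: Counter, b: Counter) -> float:
    """Jaccard similarity between two accounts' hashtag sets."""
    tags_a, tags_b = set(a), set(b)
    if not tags_a or not tags_b:
        return 0.0
    return len(tags_a & tags_b) / len(tags_a | tags_b)


def flag_suspects(accounts: list[Account],
                  max_daily_posts: float = 50.0,
                  min_overlap: float = 0.8) -> set[str]:
    """Return handles of high-volume accounts whose hashtag usage
    overlaps heavily with another high-volume account."""
    heavy = [a for a in accounts if a.daily_posts > max_daily_posts]
    suspects: set[str] = set()
    for i, a in enumerate(heavy):
        for b in heavy[i + 1:]:
            if hashtag_overlap(a.hashtags, b.hashtags) >= min_overlap:
                suspects.update({a.handle, b.handle})
    return suspects


if __name__ == "__main__":
    sample = [
        Account("@real_user", 4, Counter({"auspol": 3, "footy": 2})),
        Account("@bot_one", 120, Counter({"LaborFail": 80, "LaborLies": 60})),
        Account("@bot_two", 95, Counter({"LaborFail": 70, "LaborLies": 55})),
    ]
    print(flag_suspects(sample))  # -> {'@bot_one', '@bot_two'}
```

Real detection systems of this kind weigh many more signals (posting cadence, language models over post text, network structure, image provenance), but the sketch shows why raw volume and repeated hashtag patterns are useful first-pass indicators of inauthentic activity.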

The report highlights the significant implications of these findings for electoral integrity. The ability of malicious actors to create and deploy large numbers of fake accounts to spread disinformation poses a serious threat to democratic processes. By manipulating online discourse, these actors can potentially influence public opinion, suppress legitimate voices, and create an environment of distrust and division. The fact that these bots were able to reach such a large audience underscores the vulnerability of social media platforms to manipulation and the urgent need for more effective measures to combat disinformation.

While the impact of these disinformation campaigns on the election outcome remains difficult to quantify, the sheer scale of the operation raises serious concerns. The manipulation of online discourse through coordinated bot activity can erode public trust in democratic institutions and processes. Furthermore, the emotional nature of the content disseminated by these accounts can exacerbate existing societal divisions and fuel political polarization. The findings of this report serve as a wake-up call for social media platforms, policymakers, and the public to address the growing threat of disinformation and protect the integrity of democratic elections.

The increasing use of AI to generate fake profiles and content poses a significant challenge to electoral integrity. While the Australian Electoral Commission reports that the prevalence of incidents actually affecting elections in 2024 was relatively low, the potential for manipulation remains a serious concern, and the difficulty of identifying the individuals or groups orchestrating these campaigns further complicates the issue. Addressing this growing threat requires a multi-faceted approach: greater platform accountability, stronger media literacy among the public, and robust legal frameworks to deter and punish those engaging in disinformation tactics. The future of democratic elections hinges on ensuring that public discourse is not hijacked by malicious actors seeking to undermine trust and manipulate outcomes.
