Close Menu
DISADISA
  • Home
  • News
  • Social Media
  • Disinformation
  • Fake Information
  • Social Media Impact
Trending Now

Russian Disinformation Campaign Targets Moldova’s Upcoming Elections

September 25, 2025

Combating Misinformation About Judaism: A New Podcast by Two Teenagers

September 25, 2025

CPD: Russia Disseminates Disinformation Regarding Global Conflict Following Alleged Downing of NATO Aircraft

September 25, 2025
Facebook X (Twitter) Instagram
Facebook X (Twitter) Instagram YouTube
DISADISA
Newsletter
  • Home
  • News
  • Social Media
  • Disinformation
  • Fake Information
  • Social Media Impact
DISADISA
Home»Disinformation»Russian Data Injection Campaign Targets Large Language Models
Disinformation

Russian Data Injection Campaign Targets Large Language Models

Press RoomBy Press RoomMarch 7, 2025No Comments
Facebook Twitter Pinterest LinkedIn Tumblr Email

Russian Disinformation Network ‘Pravda’ Targets AI Chatbots to Spread Propaganda, Study Finds

A sophisticated Russian disinformation network, dubbed "Pravda" (meaning "Truth" in Russian), has been strategically targeting AI chatbots to disseminate pro-Kremlin propaganda, a new report by the analysis group NewsGuard reveals. This operation goes beyond simply flooding the internet with false narratives; its primary objective is to manipulate the training data of these powerful AI tools, effectively poisoning the well of information they draw from. The campaign’s success is alarming, with leading chatbots like OpenAI’s ChatGPT-4, Anthropic’s Claude, Meta AI, Google’s Gemini, and Microsoft’s Copilot reproducing Pravda’s narratives in a significant portion of their responses.

Pravda’s tactics involve churning out vast quantities of content, with an estimated 3.6 million articles published last year alone. This deluge of information, much of it based on recycled material from pro-Kremlin sources including Russian state media, is then ingested by the algorithms that train AI chatbots. This deliberate "LLM grooming," as it’s been termed, aims to influence the very fabric of these language models, shaping their understanding of events and ultimately influencing the information they provide to users. The strategy underscores a chilling shift in disinformation tactics, moving from targeting human audiences to manipulating the underlying technologies that shape our access to information.

NewsGuard’s investigation unveiled a vast network of approximately 150 websites connected to Pravda. This network strategically targets diverse audiences globally, with sites focusing on Ukraine, Europe, Africa, the Pacific region, the Middle East, North America, the Caucasus, and Asia. The sites operate in multiple languages and often employ deceptive domain names, incorporating names of Ukrainian cities and regions like News-Kiev.ru and Kherson-News.ru, to lend a veneer of local credibility. This sprawling network allows Pravda to amplify its message across geographical and linguistic boundaries, further increasing its potential impact on AI training data.

Over the course of the war in Ukraine, Pravda has propagated over 200 disinformation narratives, including false claims about U.S. biolabs in Ukraine and accusations of misuse of U.S. military aid by Ukrainian President Volodymyr Zelenskyy. The sheer volume of these narratives, combined with their strategic dissemination through the Pravda network, contributes to the growing risk of AI chatbots accepting them as factual. This poses a significant threat to the integrity of information disseminated by these increasingly influential tools, potentially shaping public perception and influencing decision-making on a global scale.

Experts warn of the long-term dangers posed by this form of AI manipulation. As false narratives proliferate online, the likelihood of AI models incorporating them into their responses increases exponentially. This creates a feedback loop where disinformation is not only spread but also legitimized by the very tools designed to provide accurate information. The implications are far-reaching, potentially undermining trust in AI-powered technologies and contributing to the erosion of informed public discourse.

The findings of NewsGuard’s report come at a critical juncture, coinciding with reports of a pause in U.S. Cyber Command’s activities targeting Russia. This pause raises concerns about the vulnerability of information ecosystems to sophisticated disinformation campaigns like Pravda’s. The report underscores the urgent need for robust countermeasures to address this emerging threat and protect the integrity of AI-powered information platforms. Combating LLM grooming requires a multi-pronged approach, including improved detection and filtering of disinformation within training datasets, as well as increased transparency in the algorithms used by AI chatbots. The challenge lies in striking a balance between protecting against manipulation and upholding the principles of free and open access to information. The stakes are high, as the battle against disinformation increasingly shifts to the digital battleground of artificial intelligence.

Share. Facebook Twitter Pinterest LinkedIn WhatsApp Reddit Tumblr Email

Read More

Russian Disinformation Campaign Targets Moldova’s Upcoming Elections

September 25, 2025

CPD: Russia Disseminates Disinformation Regarding Global Conflict Following Alleged Downing of NATO Aircraft

September 25, 2025

Contested Transitions: The Siege of Electoral Processes

September 25, 2025
Add A Comment
Leave A Reply Cancel Reply

Our Picks

Combating Misinformation About Judaism: A New Podcast by Two Teenagers

September 25, 2025

CPD: Russia Disseminates Disinformation Regarding Global Conflict Following Alleged Downing of NATO Aircraft

September 25, 2025

The Impact of Flagged Misinformation on Social Media Engagement

September 25, 2025

Paige Bueckers’ On-Court Impact Drives Historic Social Media Milestone with Dallas Wings

September 25, 2025
Stay In Touch
  • Facebook
  • Twitter
  • Pinterest
  • Instagram
  • YouTube
  • Vimeo

Don't Miss

Disinformation

Contested Transitions: The Siege of Electoral Processes

By Press RoomSeptember 25, 20250

Moldova’s Democracy Under Siege: A Deep Dive into the Information War Moldova, a small Eastern…

Navigating Misinformation: Introducing “The Reality Check” Series

September 25, 2025

Telegram Serves as Primary News Source for Half of Ukrainian Population, Survey Reveals

September 25, 2025

Obama Denounces Trump’s Dissemination of Harmful Misinformation Regarding Autism and Tylenol.

September 25, 2025
DISA
Facebook X (Twitter) Instagram Pinterest
  • Home
  • Privacy Policy
  • Terms of use
  • Contact
© 2025 DISA. All Rights Reserved.

Type above and press Enter to search. Press Esc to cancel.