Meta Under Fire: Pro-Russian Disinformation Campaign Generates Revenue and Raises Concerns

Meta, the parent company of Facebook, is facing scrutiny after researchers revealed that the social media giant profited from a pro-Russian disinformation network known as "Doppelganger." The network, active since 2022, disseminated thousands of sponsored posts containing anti-Ukraine and anti-Western narratives, generating hundreds of thousands of dollars in revenue for Meta. The revelation has sparked serious questions about Meta's content moderation practices and its compliance with international sanctions against Russian entities involved in disinformation campaigns.

The Doppelganger operation employed sophisticated tactics to spread its message across multiple social media platforms. Initially mimicking established Western media outlets, the network later branched out into paid advertisements on Facebook, leveraging the platform’s vast reach to target users in France, Germany, Poland, and Italy. The sponsored content often took the form of cartoons mocking European politicians or messages criticizing European aid to Ukraine, aiming to sow discord and undermine public support for the war-torn nation.

Researchers at Check First, Reset Tech, and AI Forensics, who published the "Influence by Design" report, allege that Meta received $338,000 for at least 8,000 pieces of sponsored Doppelganger content between August 2023 and November 2024. They believe the actual number of posts is likely much higher, as their analysis was limited to information gleaned from leaked documents from the Social Design Agency (SDA), one of the Russian companies implicated in the disinformation campaign. Both the SDA and another linked Russian entity were sanctioned by the EU, US, and UK in 2023 for their involvement in spreading disinformation.

Despite these sanctions, the report claims that Meta continued to approve and distribute SDA-linked advertisements. This raises crucial legal questions about Meta's adherence to international sanctions frameworks and its responsibility to prevent its platform from being used for malicious purposes. Critics argue that Meta's conduct, whether intentional or negligent, effectively provided financial support to a sanctioned entity actively engaged in spreading pro-Kremlin propaganda.

Meta has responded to these allegations by referencing previous reports on Russia-linked digital threats, including one from mid-2024 acknowledging the presence of Doppelganger-related ads. The company maintains that it was the first tech company to uncover the Doppelganger campaign and that it has blocked tens of thousands of related posts. However, these assurances fail to address the core issue of how sponsored content from a sanctioned entity was able to bypass Meta’s moderation systems and generate substantial revenue.

The Doppelganger campaign’s evolution highlights the constantly shifting landscape of online disinformation. Beyond Facebook, the operation has expanded to other platforms, including the newer social media network Bluesky. Its tactics have adapted to leverage trending topics and exploit real-world concerns, amplifying anxieties and fueling social divisions. Doppelganger frequently employs bot accounts with AI-generated profiles and identical biographies to artificially boost engagement and spread its messages further. This coordinated approach demonstrates a concerning level of sophistication and adaptability.

The relative lack of widespread audience engagement with Doppelganger content presents a paradox. While the campaign’s reach may be limited in terms of direct impact, the media attention and research reports exposing its tactics inadvertently contribute to its notoriety. This publicity, ironically, bolsters the narrative that the operation is a significant threat, lending it an undeserved degree of credibility. This highlights the challenge of combating disinformation without inadvertently amplifying its message. The ongoing struggle against sophisticated operations like Doppelganger underscores the need for continued vigilance, improved platform accountability, and enhanced public awareness of disinformation tactics. Ultimately, a collaborative effort involving tech companies, researchers, policymakers, and the public is essential to effectively counter the spread of malicious information online.
