Meta Under Fire for Profiting from Pro-Russian Disinformation Campaign
Social media behemoth Meta, parent company of Facebook, has come under intense scrutiny after researchers revealed that the platform profited from a sophisticated pro-Russian disinformation network known as “Doppelganger.” A collaborative investigation by Check First, Reset Tech, and AI Forensics found that Meta received $338,000 between August 2023 and November 2024 for hosting at least 8,000 sponsored posts linked to the operation. The revelation has sparked outrage and raised serious concerns about Meta’s compliance with international sanctions and its role in amplifying harmful propaganda.
Doppelganger, first identified in 2022, initially mimicked reputable Western media outlets to disseminate anti-Ukraine and anti-Western narratives. The network’s tactics evolved to include sponsored content on Facebook, leveraging the platform’s vast reach to target tens of thousands of users across France, Germany, Poland, and Italy. The sponsored posts, often presented as cartoons mocking European politicians or messages criticizing European aid to Ukraine, sought to sow discord and undermine public support for Ukraine amidst the ongoing conflict.
The investigation, detailed in the report “Influence by Design,” attributes the campaign to the Social Design Agency (SDA), a Russian company sanctioned by the European Union in July 2023, and subsequently by the United States and Britain, for its alleged involvement in Doppelganger. Despite these sanctions, the report alleges that Meta continued to review, approve, and distribute advertisements linked to the SDA, raising critical legal questions about the platform’s adherence to international sanctions frameworks. Researchers believe the actual number of sponsored posts is likely far higher than the 8,000 identified, since their analysis drew solely on leaked SDA documents.
Meta, while not explicitly naming Doppelganger, acknowledged the existence of a Russia-linked “coordinated influence campaign” on Facebook in September 2022. In response to the recent revelations, the company pointed to its previous reports on Russia-linked digital threats, including a mid-2024 report admitting the presence of Doppelganger-related ads on its platform. Meta maintains that it was the “first tech company to uncover the campaign” and says it has blocked tens of thousands of related posts. Critics, however, argue that this response falls short of addressing the core issue: Meta’s financial gain from the disinformation campaign.
Doppelganger’s operations have expanded beyond traditional posts and Facebook ads to encompass a wider range of social media platforms. Researchers have observed its presence on platforms such as Bluesky, an alternative to X (formerly Twitter), where it employs tactics including AI-generated profile photos, identical biographies, and coordinated replies to amplify its messages. These messages often exploit real-world problems, exaggerating and distorting them to further pro-Russian narratives, such as blaming Western powers for the conflict in Ukraine and criticizing the economic burden of aid on European countries.
While Doppelganger’s reach across all platforms remains relatively limited, the sophistication of its tactics and its persistence raise concerns. Experts note the irony that media coverage denouncing the operation, along with research reports exposing its objectives, inadvertently contributes to its visibility and perceived legitimacy. The ongoing challenge for platforms like Meta is to combat disinformation campaigns such as Doppelganger without amplifying their messages in the process, striking a delicate balance between protecting free speech and preventing the spread of harmful propaganda. The financial incentives tied to hosting such content further complicate that challenge, underscoring the need for stricter regulation and greater transparency in the fight against online disinformation.