Kremlin-backed Disinformation Campaign “Matryoshka” Falters Despite Technical Sophistication
The Russian disinformation operation known as “Matryoshka” or “Overload” is struggling to gain traction on social media platforms despite its evolving tactics and technical prowess. A recent analysis by the Institute for Strategic Dialogue (ISD) reveals that while the Kremlin-backed campaign has expanded its reach to platforms like X (formerly Twitter), TikTok, and Bluesky, its impact remains minimal. This lack of effectiveness is attributed to a combination of predictable content easily flagged by automated moderation systems and a failure to generate genuine engagement from real users.
The ISD report, which covers the second quarter of 2025, identified roughly 300 accounts linked to the “Matryoshka” operation. These accounts used sophisticated techniques, including fake logos of reputable media outlets such as Euronews, the BBC, and DW, to lend their disinformation an air of credibility. The campaign’s narratives centered on undermining trust in Western institutions and spreading false claims. The vast majority of these posts, however, were swiftly removed by platform moderators: X took down 73% of detected posts, while TikTok and Bluesky removed more than 90%. This success in content moderation suggests that the operation’s tactics have become predictable enough for automated systems to identify and remove the disinformation before it gains widespread visibility.
A key shift in the “Matryoshka” operation’s strategy is a move away from direct election interference toward long-term influence campaigns aimed at specific countries, particularly Moldova and Ukraine. Moldova’s pro-Western President Maia Sandu became a primary target, with the operation spreading false accusations of corruption and incompetence while portraying Moldova as a weak and chaotic state. Despite the volume of content generated, the campaign failed to resonate with audiences. The rapid removal of posts, coupled with a lack of organic engagement, indicates a significant failure to achieve the operation’s objectives.
In an attempt to revitalize its flagging influence, “Matryoshka” has adopted several new tactics, including a surge in English-language content and a stronger presence on TikTok. There, operatives often impersonated journalists from established news organizations such as Euronews, Reuters, and France24, using AI-generated profile pictures to bolster their credibility. These efforts largely fell flat: most accounts failed to attract followers or generate genuine engagement. Telltale signs of artificial activity, such as TikTok videos displaying significantly more likes than views, further exposed the operation’s attempts to manipulate the platform’s algorithms.
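The like-versus-view mismatch is the kind of anomaly that is straightforward to check programmatically. The sketch below is a minimal illustration of that heuristic, using hypothetical post records and a made-up `flag_inflated_posts` helper; it shows the general idea rather than any detection pipeline actually used by ISD or TikTok.

```python
from dataclasses import dataclass

@dataclass
class Post:
    """Hypothetical engagement record for a single video post."""
    account: str
    likes: int
    views: int

def flag_inflated_posts(posts, ratio_threshold=1.0):
    """Flag posts whose like count exceeds their view count.

    Organic engagement implies likes <= views, so a like/view ratio
    above 1.0 is a simple indicator of purchased or bot-driven likes.
    """
    flagged = []
    for post in posts:
        if post.views > 0 and post.likes / post.views > ratio_threshold:
            flagged.append(post)
        elif post.views == 0 and post.likes > 0:
            # Likes recorded against a video with no views at all.
            flagged.append(post)
    return flagged

# Example with made-up numbers: the second account is clearly anomalous.
sample = [
    Post("newsdesk_clone", likes=40, views=2_500),
    Post("euro_reporter_fake", likes=1_200, views=300),
]
for post in flag_inflated_posts(sample):
    print(f"Suspicious engagement on @{post.account}: "
          f"{post.likes} likes vs. {post.views} views")
```

In practice such a check would be one signal among many, since view counts lag and vary by platform, but a sustained ratio above 1.0 is hard to explain with organic behavior.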
While Telegram remains a key platform for the operation, its audience there appears to be predominantly Russian-speaking, suggesting a shift toward domestic influence efforts. This limited international reach underscores the campaign’s struggle to penetrate non-Russian-speaking audiences and achieve broader impact. The ISD analysis highlights the operation’s overall failure to engage real users across platforms. On Bluesky, for instance, posts averaged less than one like each, a clear sign of how little genuine interest the content attracted. Similarly, bot activity on X and TikTok created only a superficial illusion of engagement and failed to meaningfully influence the broader information landscape.
The ISD experts caution against excessive media attention to such campaigns, warning that it could inadvertently amplify their influence by granting them undeserved significance. They recommend focusing on instances where disinformation operations achieve genuine viral spread or demonstrate novel tactics. This measured approach prevents undue amplification of ineffective campaigns and frees resources to address truly impactful disinformation threats.

In conclusion, despite its persistent efforts to refine its tactics, expand its reach, and adopt new technologies, the “Matryoshka” operation remains largely ineffective. Robust countermeasures by social media platforms, combined with the operation’s predictable content and failure to generate organic engagement, have significantly hampered its disinformation objectives. The campaign serves as a case study in the limits of technically sophisticated disinformation operations when confronted with effective platform moderation and a discerning online audience.