Stolen voices: Russia-aligned operation manipulates audio and images to impersonate experts
6 May 2025
The Russia-aligned influence campaign Operation Overload, also referred to as Matryoshka and Storm-1679, is known for impersonating media outlets, academics and law enforcement. It aims to shift public opinion on the war in Ukraine and sow division in democratic countries. The operation’s posts often tag several media or research organisations, likely to divert their attention from other investigations. It operates across multiple platforms, including X, Telegram, and Bluesky.
This ISD Dispatch provides an overview of Operation Overload’s activities during the first quarter of 2025, focusing on its latest narratives and targets. ISD reviewed a sample of activity attributed to Operation Overload between January and March 2025. We found that across 135 pieces of content, the operation impersonated more than 80 different organisations. In most cases, it did so by misusing their logos or manipulating the voices of affiliated individuals. Operation Overload’s wide-scale impersonation campaign is ongoing and is capable of producing viral content, damaging the reputations of trusted organisations, and distracting those tagged in its posts.
Key Findings
- Operation Overload is a persistent, high-volume campaign. Its narratives exploit emerging controversies within NATO member countries and Ukraine. In a sample of 135 pieces of content, we found that the operation targeted 10 countries with a focus on Germany, France and Ukraine. Its messaging centred on undermining Ukraine’s war effort and destabilising democracies.
- This latest phase of the operation created videos designed to impersonate trusted sources, mimicking more than 80 organisations in the first three months of 2025. Three-quarters of those organisations were media outlets, universities or law enforcement. Many videos incorporated real footage and AI-generated voices of journalists, professors and police officers.
- Most of the operation’s content received limited engagement. The exception was a video falsely claiming that USAID paid celebrities to travel to Ukraine, which garnered over 4 million views. All other reviewed posts attracted little attention from real users and relied heavily on a bot network to generate likes and shares. Despite its limited reach, the operation’s high output increased its chances of reaching users, distracting the organisations tagged in its posts, and damaging the reputations of those it impersonated. ISD and others monitor this operation to measure impact, identify trends, and anticipate future campaigns.
- ISD analysed a sample of the operation’s posts on X (formerly Twitter) to assess narrative and tactical patterns, though the operation often posted the same content across multiple platforms, including Telegram and Bluesky. At times, X removed content before analysts recorded it, and some posts fell outside the methodology used to surface content, which relied on searches related to narratives and countries commonly targeted by Russian information operations (a simplified sketch of this approach follows below).
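ISD has not published its exact search queries. As a rough illustration of the approach described in the final key finding, the hypothetical Python sketch below pairs narrative keywords with country terms to build boolean queries of the kind used to surface candidate posts. Every term shown is a placeholder, not one of ISD’s actual search strings.

```python
from itertools import product

# Placeholder terms for illustration only; ISD's real search strings are not public.
NARRATIVE_TERMS = ['"terror attack" polling station', 'USAID celebrities Ukraine']
COUNTRY_TERMS = ['Germany OR Bundestagswahl', 'France OR Macron']

def build_queries(narratives: list[str], countries: list[str]) -> list[str]:
    """Combine each narrative keyword set with each country term set
    into a boolean search query for a platform's search interface."""
    return [f"({n}) ({c})" for n, c in product(narratives, countries)]

for query in build_queries(NARRATIVE_TERMS, COUNTRY_TERMS):
    print(query)
```

Queries like these can only surface content that matches the chosen terms, which is why, as noted above, some Operation Overload posts inevitably fell outside the sample.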
New year, same tactics
Operation Overload is a persistent and high-volume Russian information operation. It impersonates trusted sources to sow discord, confusion and distrust. The operation creates and posts videos that appear to be news reports, uses AI to clone experts’ voices, forges newspaper headlines, and manipulates images.
CheckFirst, Recorded Future, the Digital Forensics Research Lab and other organisations have detailed the operation’s past narratives and its efforts to overwhelm journalists and others with debunking requests via social media and email. This ISD Dispatch offers an update on Operation Overload’s campaigns throughout the first three months of 2025, highlighting new messages and targets.
During the first quarter of the year, the operation posted at least 135 pieces of content on X. It targeted 10 countries, posted in 10 languages[1], and used logos or manipulated voices of people affiliated with over 80 organisations. While its targets varied, there was a clear emphasis on Germany, France and Ukraine. Its narratives focused on weakening NATO countries’ support for Ukraine and disrupting their domestic politics, including by spreading false or misleading information about elections, politicians, and policies related to the economy, education, and health care. To do so, the operation frequently impersonated established institutions; more than three-quarters of its content appeared to originate from media outlets, universities or law enforcement.
Despite these efforts, Operation Overload saw only a single post gain substantial organic engagement on X. That post, which spread lies about USAID funding celebrity trips to Ukraine, generated over 4.2 million views after being shared by a well-known conspiracy account and then amplified by other high-profile accounts unrelated to the campaign. While that level of virality is rare, the operation could achieve more ‘breakout moments’ in the future. It also presents unique challenges to impersonated individuals and institutions, requiring them to make careful decisions about how to respond. Previously posted content continues to pose risks: roughly 80% of posts ISD identified remain available on X, leaving users vulnerable to misleading content and those impersonated at risk of reputational harm.
Targeting Europe and Ukraine
Operation Overload pivots from country to country to exploit crises and debates. In the first quarter of 2025, ISD found that three-quarters of its content was aimed at audiences in Ukraine and EU member states, particularly Germany and France.
Germany was targeted most frequently, reflecting the operation’s interest in the federal election, which took place on 23 February. The most impersonated organisation – the German international broadcaster Deutsche Welle (DW) – was also German. However, most posts and videos referencing German political issues did so in other languages, suggesting an effort to undermine Germany’s global standing or reach multilingual German audiences.

Image 1. Operation Overload post making false claims about terror threats to the elections in Germany.
France was the second most targeted country and President Emmanuel Macron was referenced more often than any other politician. Operation Overload frequently created content in French, with roughly one-third of its videos in the language.
Perhaps unsurprisingly given the focus of many recent Russia-aligned information operations, Ukraine was also in the crosshairs. Operation Overload accounts routinely targeted European and US audiences with content portraying the Ukrainian government as belligerent and corrupt.
These trends show Operation Overload’s emphasis on high-stakes geopolitical targets, such as elections, prominent politicians who support Ukraine, and Ukraine itself. Like many Russia-aligned information operations, it focuses on inserting false and misleading narratives into conversations that are already dominating international headlines.
Messaging about Ukraine and European politics
Operation Overload’s messages varied, but most focused on two goals: undermining NATO’s backing for Ukraine and creating political chaos.
Nearly a third of Operation Overload’s content sought to undermine NATO’s support for Ukraine. Its accounts pushed fake quotes from reputable sources claiming Ukraine is an “insidious provocateur” dragging Europe into World War III. The operation accused Ukraine of spreading more than 100,000 false reports and launching cyberattacks. Ukrainian officials were said to be fleeing the country and using foreign aid to buy mansions. Ukrainian refugees residing in EU countries were framed as having committed arson, theft, harassment and animal abuse.

Image 2. Operation Overload post impersonating law enforcement and fabricating accusations against Ukrainian refugees.

Image 3. Operation Overload post impersonating an academic and sharing fake quotes criticising Ukraine.
In late January, the operation zeroed in on Germany’s February election. It sought to suppress turnout with fake terror warnings attributed to intelligence agencies including the German Federal Intelligence Service (BND), the US CIA, the UK’s foreign intelligence service MI6 and Israel’s national intelligence agency Mossad. These posts spread claims that there would be terror attacks at polling stations and that German law enforcement was unprepared to respond. One post claimed voting in the election was “not worth dying for” unless the voter cast a ballot for the Alternative für Deutschland (AfD) party. The operation also accused politicians from the CSU and CDU – part of the current ruling coalition – of pedophilia and corruption.
Operation Overload sought to create division both within NATO members and between them. Much of its content focused on eroding French President Macron’s domestic support, framing him as unpopular and ineffective. Some posts even advocated for his “physical removal” and fabricated news of assassination attempts. The operation also sought to drive a wedge between the US and Europe, including by spreading fabricated quotes from European politicians attacking US President Donald Trump.
Many of Operation Overload’s narratives are too extreme to gain traction. Extravagant lies such as fake assassination plots and pedophilia accusations rarely influenced real conversations. Occasionally, however, the operation slips in subtler claims, such as fake quotes from EU politicians criticising President Trump. These are harder to immediately dismiss and potentially pose a greater risk to democratic discourse.
Impersonating trusted sources
Operation Overload specialises in creating videos that appear as though they were made by trusted organisations. In the three-month span, it used logos or manipulated voices of people associated with 81 organisations. More than three-quarters were media outlets, universities or law enforcement agencies, and they were most frequently French, British or German.

Image 4. Operation Overload post impersonating the Wall Street Journal and sharing false information about USAID funding.

Image 5. Operation Overload post impersonating DW and spreading false information about immigrants in Germany.
Most of these videos were designed to look like media reports by superimposing a media outlet’s logo onto a video clip featuring false claims. In total, the sample of content analysed by ISD used logos from 28 different outlets. DW was the most imitated outlet (its branding appeared in 15 different videos) followed by the BBC, Sky News and Le Point.
Roughly 40 percent of videos took what appeared to be real footage of academics, reporters, or law enforcement officers from social media and added AI-generated audio to make it sound as though they said something false or inflammatory. Academics had their voices replicated roughly four times more often than people from other sectors. They are likely chosen because they are viewed as authoritative experts but are generally unrecognisable to the public. This lends the operation’s content perceived credibility and makes it difficult to identify when speech has been artificially created or extended.
Operation Overload also spread doctored images including fake news headlines from at least eight outlets and false images of pro-Russia graffiti in the US and Europe. One picture showed graffiti on a building in California featuring Ukrainian President Volodymyr Zelensky tearing apart the word “USAID”. This was apparently designed to create the false impression of grassroots opposition to Ukraine in the state.
This wide-scale impersonation campaign advances Russia’s narratives, erodes public confidence in institutions, and undercuts the public’s ability to distinguish reliable information from manipulated content.
Causing harm in multiple ways
Though its overall reach was limited, Operation Overload illustrates the various ways information campaigns can have an impact – ranging from isolated viral moments and reputational harm to deliberate disruption of the research and fact-checking communities.
Operation Overload struggled to gain traction on X, with one notable exception: a widely circulated video that claimed USAID funded celebrity trips to Ukraine. After a well-known conspiracy account shared the video, multiple high-profile figures amplified it, helping it reach more than 4.2 million views.
However, most likes and shares on the operation’s content came from a large bot network. A review of interactions with these posts showed that all engagement occurred within a single minute shortly after the posts went online, a strong indicator of bot activity. The same apparent bot network also promotes cryptocurrency and pro-China posts. The French government agency VIGINUM reported that Operation Overload appears to buy X accounts from a company called WebMasterMarket, demonstrating how state-aligned actors use commercial services, including AI chatbots, PR agencies and hosting providers.
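ISD has not published its analysis code; the following minimal Python sketch illustrates the timing heuristic described above, assuming timestamped engagement data is available for each post. The function name and the 60-second and 90-percent thresholds are illustrative assumptions, not values drawn from the investigation.

```python
from datetime import datetime, timedelta

def looks_like_bot_burst(post_time: datetime,
                         engagement_times: list[datetime],
                         window_seconds: int = 60,      # assumed window size
                         threshold: float = 0.9) -> bool:  # assumed share
    """Flag a post whose engagement is concentrated in one short window
    shortly after publication, a pattern consistent with coordinated
    (bot) amplification rather than organic spread."""
    if not engagement_times:
        return False
    # Seconds between the post going live and each like/share.
    offsets = sorted((t - post_time).total_seconds() for t in engagement_times)
    # Slide a fixed-length window over the sorted offsets to find the
    # largest number of interactions that fit inside it.
    best, start = 0, 0
    for end in range(len(offsets)):
        while offsets[end] - offsets[start] > window_seconds:
            start += 1
        best = max(best, end - start + 1)
    return best / len(offsets) >= threshold

# Example: 50 likes arriving 30-80 seconds after posting trip the heuristic.
post = datetime(2025, 2, 1, 12, 0)
likes = [post + timedelta(seconds=30 + i) for i in range(50)]
print(looks_like_bot_burst(post, likes))  # True
```

Timing concentration alone is a weak signal; analysts would combine it with account-level indicators, such as the network’s parallel promotion of cryptocurrency and pro-China content, before attributing activity to bots.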
Operation Overload’s potential impact is not limited to engagement metrics. Outlets whose logos were appropriated and placed on false reports risk reputational damage and a loss of public trust. Impersonated law enforcement agencies may find it harder to raise awareness about future threats if their logos become linked with false information. Academics whose voices were manipulated to praise terrorists, criticise vaccines or attack politicians could face professional blowback or threats.
Beyond this, each Operation Overload post tagged researchers, media outlets and other institutions to distract them with additional debunking efforts. This has the potential to divert resources from addressing more pressing or credible online threats.
Conclusion: The incoming overload
Operation Overload continues to churn out a significant amount of content – misleading social media users, impersonating and undermining trust in reputable organisations, and distracting researchers. As recent trends show, the operation’s multilingual capability enables it to target emerging geopolitical controversies and events in various countries. Elections, high-profile politicians and Ukraine-related developments will almost certainly remain in its crosshairs.
This understanding of Operation Overload’s targets and tactics should enable platforms to better monitor and address its activity. Platforms where the operation posts, such as Telegram, X and Bluesky, should actively track and remove its content. Given Operation Overload’s tendency to post identical videos and images across various platforms, coordination between trust and safety teams would allow for faster identification and removal of content, reducing its visibility across the information space.
Researchers should also build on these findings to raise awareness around particularly impactful or malicious posts from the operation. Improving people’s understanding of recurring narratives and viral posts can help guard against manipulation. Additionally, organisations and individuals who have been impersonated by Operation Overload should be notified and given a chance to respond as they see fit. The likelihood of the operation causing harm grows as it continues sharing content. Additional monitoring is needed to track its evolution, assess its impact and enable timely responses.
End notes
[1] Operation Overload targeted Canada, Denmark, France, Germany, Italy, Poland, Spain, Ukraine, the United Kingdom, and the United States. It posted in Arabic, English, French, German, Indonesian, Japanese, Portuguese, Spanish, Turkish, and Ukrainian.