Russian influence operations pose a direct threat to truth and security in Ukraine and across Europe. A new report, Attributing Russian Information Influence Operations, produced by the Ukrainian Centre for Strategic Communications and the NATO Strategic Communications Centre of Excellence, shows how the Information Influence Attribution Framework is applied to identify responsibility and expose patterns of manipulation.
The aim of the report is to test and refine the Information Influence Attribution Framework (IIAF) by applying it to real-world Russian campaigns. This is done in a context where EU sanctions on Russian state media, the Foreign Information Manipulation and Interference (FIMI) policy framework, and the Digital Services Act (DSA) are raising evidential standards for attribution.
How IIAF works
According to the NATO StratCom COE and Hybrid COE framework (2022), 'attribution' can be conceptualized through three types of evidence: technical, behavioural, and contextual, complemented by a legal and ethical assessment. Technical evidence refers to traces left by illicit activities, such as IP addresses or digital signals. Behavioural evidence examines manipulative activities and methods, including Tactics, Techniques, and Procedures (TTPs). Contextual evidence considers the content and political dimensions, such as messaging and narratives. Finally, the legal and ethical assessment evaluates proportionality, data protection, and geopolitical considerations related to the use of each type of evidence.
Each category of evidence can draw on different sources: open sources (e.g. research, open APIs, OSINT), proprietary sources (e.g. social media backends, private-sector intelligence), and classified intelligence (e.g. SIGINT, HUMINT). Analysts rarely have access to all sources and in public attributions, open-source and proprietary data are typically dominant.
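The evidence-by-source structure described above can be sketched in code, purely as an illustration. The types, fields, and sample items below are hypothetical and are not part of the report or of the IIAF itself; the sketch only shows how the three evidence categories, the three source tiers, and a shareability flag from the legal and ethical assessment might be modelled together.

```python
from dataclasses import dataclass
from enum import Enum

class Category(Enum):
    TECHNICAL = "technical"      # traces: IP addresses, digital signals
    BEHAVIOURAL = "behavioural"  # manipulative activity, TTPs
    CONTEXTUAL = "contextual"    # messaging, narratives, political context

class Source(Enum):
    OPEN = "open"                # research, open APIs, OSINT
    PROPRIETARY = "proprietary"  # platform backends, private-sector intelligence
    CLASSIFIED = "classified"    # SIGINT, HUMINT

@dataclass
class EvidenceItem:
    category: Category
    source: Source
    description: str
    shareable: bool  # outcome of the legal/ethical assessment: citable publicly?

# Hypothetical case file -- invented items for illustration only
case = [
    EvidenceItem(Category.TECHNICAL, Source.OPEN,
                 "IP overlap with known infrastructure", True),
    EvidenceItem(Category.BEHAVIOURAL, Source.PROPRIETARY,
                 "coordinated posting pattern (TTPs)", True),
    EvidenceItem(Category.CONTEXTUAL, Source.OPEN,
                 "narrative matches Kremlin messaging", True),
]

# Count how many publicly citable items back each evidence category
public_basis = {
    c.value: sum(1 for e in case if e.category is c and e.shareable)
    for c in Category
}
```

In practice, as the report notes, the classified tier is often empty or unusable in public attributions, which is exactly what a shareability flag like this would capture.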
The Information Influence Attribution Framework (IIAF) serves two main purposes. First, it demonstrates that attribution is the result of multiple assessments combined to form a credible picture. Some evidence categories may be strong, while others are minimal. Even when technical evidence is available from both open and classified sources, legal and ethical considerations may prevent attribution, for instance, to protect sensitive intelligence.
Second, the framework provides a way to communicate and share high-level, non-specific information about the factors underpinning an attribution. It can indicate that open-source contextual data was the primary basis for a decision, providing stakeholders with transparency and nuance about the attribution process.
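This second purpose, communicating a high-level, non-specific summary of what an attribution rests on, could look something like the following sketch. The record layout and the function are hypothetical, not drawn from the report; the point is only that the dominant source-and-category pairing can be stated without exposing the underlying evidence.

```python
from collections import Counter

# Hypothetical evidence log: (category, source, publicly_shareable)
evidence = [
    ("contextual", "open", True),
    ("contextual", "open", True),
    ("behavioural", "proprietary", True),
    ("technical", "classified", False),  # withheld on legal/ethical grounds
]

def attribution_summary(items):
    """Summarize the primary public basis of an attribution without
    disclosing any specific evidence item."""
    shared = Counter((src, cat) for cat, src, ok in items if ok)
    (src, cat), n = shared.most_common(1)[0]
    return f"Primary public basis: {src}-source {cat} evidence ({n} items)"

print(attribution_summary(evidence))
# prints "Primary public basis: open-source contextual evidence (2 items)"
```

A stakeholder reading that one-line summary learns which kind of evidence carried the decision, which is the transparency the framework aims for, while the classified item stays invisible.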
From fake recruitment posters in Warsaw…
The report presents a number of real-world cases and shows how the DISARM framework can be applied to analyse Russian information influence operations. One case is particularly interesting as an illustration of how a Russian disinformation campaign develops.
During the early stages of Russia's full-scale invasion of Ukraine, Kremlin-linked Telegram accounts pushed a false narrative claiming that Poland planned to annex parts of Ukraine. The operation combined forged documents, impersonation of credible sources, and targeted amplification to spread misleading content across platforms and languages. Pro-Kremlin Telegram channels circulated fake images claiming Polish military recruitment posters were appearing in Warsaw Metro stations, urging citizens to 'protect ancestral Polish lands' and 'become a Leopard tank operator'.
Kremlin-linked Telegram accounts pushed a false narrative claiming that Poland planned to annex parts of Ukraine. – Case study cited in the report Attributing Russian Information Influence Operations
The Russian channel Signal published doctored photos of billboards depicting General Jarosław Mika alongside the phrase “It’s time to remember history,” referencing historical Polish claims over western Ukraine. The channel also cited flag removals and statements by Russian intelligence chief Sergei Naryshkin, falsely claiming Poland intended to invade Ukraine. The story was amplified by Kremlin-tied channels such as Gossip Girl and Legitimniy, which reframed context to portray Poland as censoring Ukrainian history.
…to manipulated President’s speech
On May 3 of that year, the Telegram channel Rokot|Ryk posted a video of Polish President Andrzej Duda's speech about Ukraine-Poland cooperation. Taken out of context, the video falsely suggested Duda supported the annexation of Ukrainian territory. Pro-Kremlin channels propagated the claim that Ukraine would be renamed 'Ukropol'. Two days later, a version with Russian subtitles circulated on Zheltye Slivi and was shared by Signal and the Ukrainian Kremlin-linked channel ZeRada, alleging Poland's imperial ambitions and citing the forged billboards as 'evidence'. Already on May 4, a manipulated video carrying the BBC News logo had repeated the claim that Poland planned a military incursion, complete with forged orders signed by General Mika and fabricated imagery of troops and helicopters in northern Poland.
The result? The video falsely stated that Washington endorsed Poland's actions and that NATO would stand aside. It was disseminated on Twitter, Telegram, and Facebook in multiple languages, including Russian, French, Italian, Turkish, and Czech.
Detecting disinformation as a political task
Attributing Russian information influence operations is a complex task, but it can be done credibly when multiple types of evidence are combined: technical signals, behavioural patterns, and contextual analysis. Tools like the DISARM framework help identify the tactics behind these campaigns. Precise language, distinguishing 'state-shaped' from 'state-directed' operations, makes attributions clearer and more accountable.
Attribution is as much a political act as an analytical one. It requires transparency, collaboration between governments, platforms, and civil society, and support for independent investigators.
By applying these methods consistently, society gains the ability to uncover who is behind disinformation. It also has the power to respond effectively, protecting public trust and the integrity of information in an age of constant manipulation.