The objective? Drive a wedge between allies, sow distrust and undermine the logic of support—all wrapped in the style of a low-budget Telegram thriller.
Take, for example, a video circulating online: supposedly, Ukraine’s security service (SBU) arrests two Poles for displaying a poster featuring the Polish anthem and flag.
The setting looks like a parody—an apartment resembling a communist-era community hall. The "actors" speak no Polish, don't behave like Ukrainians, and the SBU logos seem crudely pasted on in MS Paint.
Meanwhile, the television in the background is tuned to Russia's state-run channel Zvezda. It's hard to imagine a more absurd piece of "news."
Experts have no doubt: this is a textbook Russian fake—rushed production, a strong emotional trigger, no attention to detail.
The aim isn't quality. It’s impact. As for the truth? That depends on who's watching.
"Sow discord among his generals. Make him unsure of what is real and what is not." No one knows for sure who first wrote these words of ancient wisdom about disinformation. Some suggest they trace back to The Art of War by Sun Tzu. Regardless of origin, disinformation is nothing new—and it's constantly evolving.
The recipe for these operations is as simple as it is effective: pick a sensitive topic and wrap it in patriotic rhetoric. Stage a scene with random actors. Upload the video to a pro-Russian Telegram channel and wait for it to spread across social media.
Add a few comments about "oppression of Poles" or "Ukrainian ingratitude," and watch as a wave of outrage ripples outward.
'If everything is suspicious, then nothing is certain'
Crucially, these fakes are often deliberately crude. As an old Petersburg saying goes: "If everything is suspicious, then nothing is certain."
The more something looks like a provocation, the more likely people are to question their instincts—and this weakens public resistance. "Maybe it did happen?" begins to creep into the minds of those who only watched 30 seconds.
And that’s not the only case in recent days. Another fake claimed that Ukraine had halted exhumations of victims of the Volhynia massacre.
Shared mostly in pro-Russian groups on Facebook and X, it alleged that Ukrainian authorities had unilaterally stopped work at a site in Puźniki.
Russian disinformation agents even forged a document from Ukraine’s culture ministry. In truth—as Poland’s Ministry of Culture confirmed—no such work was ever halted. But by the time the correction appeared, the lie had already reached thousands.
And that’s the point. The goal of Russian strategy isn't to establish lasting truths—it's to spark anger, fear and mistrust. Once that's done, the lie can be discarded. The next one is already queued up.
Lies spread faster than corrections
Today, the Kremlin favours quantity over precision. The average fake news item costs about as much as a pizza. A few actors, some props, a smartphone camera—that’s all it takes.
Compared to high-budget operations like the MH17 cover-up or the Skripal poisoning narrative, today's disinformation efforts resemble amateur theatre.
And yet, they work. Why? Because most people don’t verify sources. Because emotions are faster than facts. Because no one has time to check context when everything is filtered through posts, reels and memes. Because lies spread faster than corrections. And because the damage is amplified when public figures share fakes.
Anatomy of a Russian provocation: 9 steps to chaos
In March, the US-based agency NewsGuard released an analysis showing how artificial intelligence is now being used not just to generate isolated fake news stories, but to orchestrate entire disinformation strategies. Things are getting serious.
Let’s try, for a moment, to step into the shoes of a Russian disinformation officer. This will give us a look at his "toolkit." Because even though AI is playing a growing role, at the heart of every operation is still a human being.
So how might a disinformation campaign titled "Ukrainian Security Service Arrests Poles for Displaying National Symbols" come together?
Step 1: Choose the battleground
The officer starts by identifying a flashpoint—ideally, something that ties together history, emotion and current tensions. Exhumations in Volhynia? Perfect. The Polish anthem being played in Kyiv? Even better. Unfortunately for us, the Russian state has spent centuries studying the mental maps of both Poles and Ukrainians—and every other nation it has fought, antagonised or plans to provoke.
Step 2: Define the target audience
Here, too, the officer’s job is easy. He zeroes in on those most susceptible: nationalists, anti-Ukrainian voices, those sympathetic to Russia, or simply people weary of the war.
Step 3: Craft the storyline
Budget matters. If the officer needs to be frugal, the scenario will be crude and simplistic: a proud Pole displays his flag; Ukrainians react violently. Toss in a national anthem, a flag, or another clear symbol to make sure no one misses the message—and it’s ready to go.
Step 4: Produce the content—cheap and grainy on purpose
No need for high production value. In fact, low quality makes the video feel more "authentic" to the naïve viewer and provides plausible deniability.
"If this were really a Russian op," skeptics might say, "it would’ve been better produced."
Step 5: Seed the story
The fake video is first posted to a Russia-linked Telegram channel. From there, it spreads to Facebook, X, and internet forums. Bots step in to post the initial comments—often cautious or skeptical in tone. "Is this real?" "Can someone verify?" This strategic ambiguity lends the post a veneer of legitimacy.
Step 6: Fan the flames
Soon, real users pick up where bots left off. The comments shift from uncertain to outraged: betrayal, humiliation, lack of gratitude. Likes and shares snowball. Social media algorithms—always eager to amplify outrage—do the rest.
Step 7: Bring in the influencers
At this point, the officer needs someone with reach—an ex-politician, a columnist, a known provocateur. Ideally, someone with an established audience and the credibility to "notice" the story. To nudge things along, the post tags that person, or a fringe outlet hungry for attention.
Step 8: Monitor and tweak
Now the officer watches the reaction. Is the story catching on? If not, he updates the script, records a new version, changes the subtitles. A fake lives for about 48 hours before being replaced by a "new angle"—a fresh clip that appears to confirm the first, but from a "different viewpoint."
Step 9: Pull back—and muddy the waters
When fact-checkers and institutions start to debunk the story, it’s time to retreat. The officer shifts the narrative: "It was your provocation." "Probably Ukrainian—those guys are always stirring trouble." "We had nothing to do with it."
As the lie falls apart, the aim is to blur, not clarify. Real political events—say, "The Polish PM had to ride in the second train car to Kyiv"—are mixed in to reinforce the fake's emotional logic. Chaos is the end goal.
Now, the disinformation officer can pour himself a drink.
How not to get fooled
The more difficult the real-world political situation becomes, the more likely we are to see an uptick in smear campaigns, fake arrests, staged provocations and forged documents. All of it will be wrapped in narratives like "Poland is fed up with ungrateful Ukraine." These stories may grow more absurd, but they’ll also be increasingly targeted at our emotional pressure points.
The responsibility for resisting this doesn’t lie solely with the media—it falls on all of us. We need to build up our immunity. Learn the patterns. Keep our distance. Think critically. And above all, don’t share every shocking headline just because it “seems real.”
Each of us has tools at our fingertips to at least gauge the credibility of a "news" item, if not verify it outright—browser searches, AI-driven fact-checkers, and more. Pay attention to timestamps. If something’s been online for days but hasn’t made it to mainstream news, ask yourself why. How did this "blockbuster story" escape wider notice?
Conversely, if the story is only a few hours old, consider flagging it and waiting for verification from reliable outlets. Don't try to win the race to be first—because you might just be helping a hostile state use you as a pawn. Why let that happen?
It’s also crucial to check where the story first appeared. Russia’s disinformation machine manufactures fake news sites and blogs—a nesting doll of phony sources. Be cautious.
Pay attention to the tone and format of the content. If something feels overly emotional or pushes a single, clear-cut conclusion—alarm bells should go off.
Language matters, too. Watch out for unnatural phrasing—things like nationality names written in lowercase, awkward sentence structure, incorrect prepositions, or misused adjectives. These are often signs of machine translation or non-native speakers.
Poland and Ukraine are ideal targets for the Kremlin: a shared history full of painful chapters, strong national emotions, and a war raging just across the border. Russia knows exactly where to strike.
Today, the war isn’t just being fought on Ukrainian soil. It’s also being fought in our minds. And it’s there that every victory—or every defeat—begins.
Sławomir Sieradzki
The author is a senior analyst at public broadcaster Polish Radio.