The conflict involving the United States, Israel and Iran has triggered a wave of disinformation around the world, including in Poland, according to experts who say such content has become part of modern information warfare.
In Poland, the false narratives have included antisemitic themes, anti-Iranian messages, and claims designed to increase public anxiety.
Agnieszka Lipińska, head of NASK’s Disinformation Analysis Center, said those behind the misleading content are trying to stoke fears that the fighting could spread. Some posts went as far as claiming that World War III was already underway, or suggested that Poland’s government had been caught off guard and did not know about the planned attacks.
NASK is a state research institute focused on cybersecurity and digital security. On its website, NASK explains that disinformation consists of false or misleading content that can appear in many forms, including social media posts, articles, comments, videos, and audio recordings.
It is spread deliberately to cause public harm or generate profit, with broader aims such as creating fear, eroding trust in institutions, and deepening social divisions.
Ordinary internet users can become part of this process simply by sharing false material online. Over time, disinformation can weaken confidence in public institutions, science, and the media, while contributing to social polarization and broader state instability.
Agnieszka Skruczaj-Olejnik, a statistician and psychologist from SWPS University in Warsaw, said disinformation is part of the strategy of conflict itself because it shapes how societies understand events and can influence political and social decisions.
Among the false stories circulating in Poland was a claim that Jews were being evacuated to Poland after the outbreak of fighting in the Middle East.
Lipińska said the narrative draws on antisemitic ideas and conspiracy theories, including one about a supposed “Heavenly Jerusalem” to be created in Poland.
Another misleading claim suggested that Poland had shown solidarity with Israel by lighting up Warsaw’s Palace of Culture and Science on February 28.
The building was indeed illuminated that day in blue, pink and green, but the display had nothing to do with the conflict. It marked Rare Disease Day, an annual medical awareness campaign focused on patients with rare illnesses and their families.
Lipińska said Polish social media users have also been exposed to propaganda and false reports about the fighting itself.
She pointed to content suggesting panic in Israel after Iranian attacks and to posts claiming that American facilities had been destroyed, including an alleged CIA headquarters in Qatar.
She said the identified messages fit into broader anti-Western, anti-American, antisemitic and conspiracy-based narratives.
Their purpose, she added, is to polarize public opinion and weaken trust in state institutions. The biggest reach for such content in Poland was recorded between February 28 and March 3.
One of the earliest global falsehoods used footage from the realistic military video game War Thunder.
The clip was presented online as evidence that Iran had shot down a US F-15 fighter jet in Kuwait. The game’s creators later said they did not support the use of in-game footage to spread false information.
According to British media reports, one fake video posted on Instagram was viewed nearly 79 million times.
A different kind of manipulation involved artificial intelligence. Malachy Browne of The New York Times described a case in which Grok, the AI chatbot on the social platform X, wrongly labeled a real image shared by Iranian authorities as fake.
The photograph showed graves prepared for victims of a missile strike on a girls’ elementary school in the city of Minab, where more than 170 people were killed, most of them students. The chatbot incorrectly told users that the image came from Indonesia in 2021 during the COVID-19 pandemic.
Skruczaj-Olejnik said disinformation works through three main psychological mechanisms.
First, it plays on emotion, making internet users more impulsive and less likely to examine what they see carefully.
Second, repeated exposure can blur people’s memory of where a claim came from, making it seem more credible simply because it appears often.
Third, this creates cognitive uncertainty, a state in which people are no longer sure what is true and begin to assume that widely shared content must be reliable.
She said modern technology has made the problem easier to scale. A single troll with multiple social media accounts and access to artificial intelligence tools can quickly generate many versions of the same false narrative.
Repeated often enough, she said, such content can start to feel true, especially when fear is already weakening people’s judgment.
The NASK website further outlines the building blocks of disinformation campaigns.
Common techniques used in disinformation include clickbait, which relies on sensational or highly emotional headlines designed to grab attention, and emotionally loaded language meant to provoke fear, outrage, anger or resentment.
Disinformation can also involve impersonation, using someone else’s image or authority to spread harmful content, as well as cherry-picking, where selected facts are presented in a way that supports a claim while leaving out important context.
Another method is the false-cause fallacy, which wrongly assumes a cause-and-effect link between events, while anecdotal evidence uses personal experience to cast doubt on statistical data or research findings.
More advanced forms include deepfakes, which are fake audio or video materials generated by artificial intelligence, and cheapfakes, which manipulate audiovisual content using simple, widely available tools.
(rt/gs)
Source: PAP