Artur Urbaniak of Adam Mickiewicz University in Poznań, western Poland, said young people are being pulled in two directions at once: many can spot falsehoods, but they also doubt information that is true.
Urbaniak said his findings point to a practical lesson that is easy to overlook in a world of slick online content: knowledge still matters.
“The more developed someone’s knowledge is, the harder it is to manipulate them,” he said.
“What we observe is rather a crisis of trust,” he told Poland’s PAP news agency, adding that respondents rarely declared certainty.
Instead of saying a claim was “definitely false,” he said respondents tended to choose answers like “it may not be true” or “it’s hard to say.”
Urbaniak, who works at the university’s Institute of Applied Linguistics, led an international pilot project testing how language can make fabricated stories feel credible.
The study involved 47 participants in Poland and 47 in the United States. Respondents watched short “world news” items, invented by the researchers and delivered by an AI-generated avatar.
After each clip, participants rated credibility on a five-point Likert scale, a common survey tool whose answers here ranged from “definitely true” to “definitely false,” and then discussed what shaped their judgments.
One test item claimed Rome’s Colosseum would be rebuilt into a modern commercial and sports complex, with a roof and glass-and-steel walls.
Among Polish respondents, 66 percent rated the claim “definitely false” and 19 percent “probably false.”
In the US group, 37 percent judged it “probably true” and 9 percent “definitely true.”
Urbaniak said the gap points to a basic vulnerability: people who lack background knowledge or cultural intuition are easier to mislead.
“If audiences lack hard knowledge, or even intuition that comes from cultural context, it is easier to make a mistake,” he said.
He argued that deception often works less through a single dramatic lie and more through “truth markers,” details that mimic reliable reporting, such as specific numbers, technical language, and references to real institutions.
He described this as boosting credibility through the perceived authority of an expert body, scientist, doctor, or public figure.
In misinformation research, “misinformation” usually means false information shared without intent to deceive, while “disinformation” refers to falsehoods spread deliberately.
A broader body of research is arriving at similar conclusions about young audiences.
A 2025 scoping review in the journal Computers in Human Behavior, which synthesized over 150 studies of young people aged 5 to 25, found that many young people overestimate their ability to spot false information, often struggle to detect it in practice, and frequently respond passively by ignoring it rather than challenging it.
The review also linked exposure to false information with confusion and fear, and with wider civic effects such as lower political participation. It argued that prevention programs should strengthen detection skills, encourage active countering, and address disparities tied to social background and peer dynamics.
The same theme appears in a recent Nature Human Behaviour perspective, published in November 2025, that calls for targeted, evidence-based interventions for adolescents, described as among social media’s most avid users.
The authors argue that research has focused mainly on adults even though adolescents face different kinds of content and go through rapid social, emotional, and cognitive changes.
Those changes, they write, can increase vulnerability through social influence, emotional manipulation, and cognitive biases, while also creating opportunities to build resilience.
The article proposes a developmental research agenda and introduces a framework for understanding belief updating in social media environments, that is, how people revise what they believe when they encounter new evidence.
That perspective was co-authored by developmental psychologist Ili Ma, who in 2024 received a EUR 1.5 million European Research Council grant at Leiden University to build a research program on youth susceptibility to misinformation and disinformation.
Ma said teenagers are “growing up in a digital world where misinformation and disinformation are becoming bigger and bigger problems,” and she plans to combine experiments with qualitative work to identify which lies and fallacies are most persuasive to different age groups.
She has also said the project will involve close cooperation with young people, parents, and professionals, including digital neighborhood officers and counterterrorism experts, to keep the research relevant to real-world risks.
The Polish pilot project was funded by a grant from Poland’s National Science Centre (NCN).
Urbaniak worked with an international research team that included scholars from the United States, Canada, Ireland, Slovakia, and Vietnam.
(rt/gs)
Source: naukawpolsce.pl, scienceinpoland.pl