
Disinformation
TikTok, X & Co. – Media Usage and Susceptibility to Disinformation

Young people and TikTok users are particularly susceptible to disinformation narratives.

© picture alliance / ROBIN UTRECHT | Robin Utrecht

Europe’s information sphere increasingly runs through TikTok, X, YouTube, and a handful of other platforms that were neither designed nor secured for democratic resilience. A brief U.S. ban on TikTok in January this year reignited the argument over national security and foreign influence on the information sphere, but the more urgent story is closer to home: in Germany, researchers have shown that recommendation engines can nudge new users toward far-right content, while X has drawn criticism for Elon Musk’s open endorsement of the AfD. It has long been suspected that social media shapes what users see and think. What remains unclear is how, and to what extent, algorithms systematically shift beliefs, particularly around disinformation.

A nationwide survey in Germany at the beginning of this year, commissioned by the Friedrich Naumann Foundation for Freedom, tries to illuminate that gap. Social media users in general, but especially users of TikTok, X, and YouTube, are more likely to accept false or misleading narratives. Among German TikTok users, worryingly large shares endorse propaganda-friendly positions: for example, that Russia is fighting a “fascist regime” in Ukraine, or that China is not a dictatorship. Forty‑two percent of TikTok users surveyed say authoritarian systems such as China’s are more effective than democracy.

There is also a stark generational pattern visible in the data. Nearly two-thirds of respondents over the age of 60 fully agree that China is a dictatorship. Among 16–30‑year‑olds, only about a third do, and roughly one in five do not see China as a dictatorship at all. Media consumption correlates strongly with these views. Just 28 percent of TikTok users fully agree that China is a dictatorship, compared with about half of the overall sample. The same split appears on Russia. A strong majority recognizes the invasion of Ukraine as an illegal war of aggression, and more than half support Western assistance. Yet almost one in five still believes Moscow has a greater interest in peace than the West, with higher agreement among people aged 16–44.

In public debate, Russia and China are frequently cited as the main drivers of coordinated disinformation. The survey tested whether citizens share that view. Seventy percent of respondents identify Russia as the leading source of false information, though nearly a third do not. China follows at 59.2%, then North Korea (57%), Iran (45.7%), Turkey (41%), the United States (37.9%), Israel (32.9%), and Germany (22.7%). Media consumption also matters. TikTok users are far less likely than average to see China as a source of disinformation (41.4% vs. 59.2%), while users of public broadcasters (68.9%), local newspapers (68.6%), and national newspapers (67.7%) are markedly more skeptical of Beijing. TikTok users are not generally less concerned about disinformation; they just locate it differently: they are more likely than the broader public to suspect Germany itself (34.3% vs. 22.7%) and less likely to suspect Russia (50.2% vs. 70%) or China (41.4% vs. 59.2%). Perception gaps show up elsewhere, too: 31.3% of X users say they have noticed disinformation in public broadcasting, vs. 18.4% overall. Together, these patterns underline how age and information sources shape not only what people believe, but whom they blame.

The survey also shows that many do not feel equipped to face disinformation. More than half of respondents see disinformation as a serious problem, yet about half say they struggle to recognize false narratives. Younger people rate themselves more capable of spotting manipulation, but the same group shows signs of greater exposure and, in places, greater susceptibility. It is tempting to reach for simple fixes, but after decades of research it has become clear that there is no simple answer. Meta-analyses repeatedly show that what works in one context underperforms in another; even fact-checking, among the more studied tools, shows uneven effects depending on language, topic, and audience. A central obstacle remains access to data: researchers still cannot reliably study how recommendation systems rank, amplify, and shape civic information consumption. Europe’s Digital Services Act (DSA) was designed to change that, yet legal battles over data access and auditing show how fiercely some platforms resist transparency. In recent months, the U.S. administration has additionally pushed back against European legal frameworks such as the DSA, making their enforcement harder.

This is not a neutral fight. The platform layer is governed by business models, leadership choices, and incentives that leave fingerprints on public discourse. Under its current ownership, X has escalated its confrontation with EU rules such as the DSA and Europe’s data protection regulation, the GDPR. Autocracies have become adept at gaming engagement-driven systems to seed and spread their narratives. When a meaningful part of younger users begins to view authoritarian governance as “more effective,” we are not just debating content moderation. We are confronting a fundamental erosion of trust in democratic institutions. Europe has the tools, and it should use them.

Disinformation is both a symptom and an accelerant of deeper geopolitical and social shifts. Europe cannot outsource the integrity of its information space to private companies—or to the autocracies that have learned to exploit them. If we want democracies that can survive the stress test, we will need accountable platforms, enforceable laws, capable research, adaptive interventions, and citizens willing to be custodians of a shared reality.