by Elina Kuokkanen & Annika Niesen, Beyond the Horizon, Belgium.
Disinformation, misinformation, and manipulation are flooding the information space, overburdening our cognitive capacities and creating barriers to democratic participation across the European Union. Young people are active users of social media, yet they often lack the skills to critically assess the content they encounter. Developing media literacy, digital literacy, and critical thinking skills is therefore more important than ever. Teachers, however, do not face an easy task: the rapidly changing technological landscape brings emerging threats with it, and the speed and intensity of disinformation make countering and correcting false narratives challenging.
Inoculation theory attempts to provide a solution: pre-emptive protection against manipulation. The theory is based on the premise that exposure to small, controlled doses of misinformation in a safe environment builds resistance to it. Just as a vaccine introduces a weakened pathogen so the immune system can learn to recognise and fight it, structured classroom exercises can train students to identify manipulation techniques before they encounter them in real life.
This approach underpins the Immune 2 Infodemic project, an EU-funded initiative that has directly engaged over 2,250 learners across five Member States. The project has developed 31 practical tools that span the full range of competencies students need. Critical thinking tools help students recognise cognitive biases, map arguments, and question conspiracy narratives. Media literacy tools teach students how to distinguish fact from opinion, evaluate sources, and identify emotional manipulation. Digital literacy tools address everything from managing one’s data footprint to recognising algorithm-driven content bubbles and AI-generated material.
A teacher’s package for the classroom
To support teachers further, we combined core elements and materials into a concise Teacher’s Package, designed specifically for high school teachers. With these materials, we aim for a threefold impact: increasing knowledge, training skills, and creating attitudinal change. Theoretical sections equip teachers with the necessary knowledge of the motives, tactics, and methods that fuel disinformation. An example lesson structure provides a frame for integrating media literacy learning into the curriculum. Real-life cases enable students to practise fact-checking in a safe, controlled environment where they can receive feedback. Reflection questions and tips for digital hygiene help open conversations about aspects that might worry students and help establish conscious, healthy media consumption habits.

Games like “Bad News” put students in the role of someone spreading false content and help them understand the mindset of a disinformation creator. Games and fun exercises serve another goal as well: they help students manage their fear. Disinformation works not just by deceiving the intellect but by triggering strong emotions, particularly “negative” emotions like fear, outrage, and disgust – worry and fear paralyse decision-making and impede our ability to think critically. Laughter and playful activities have the opposite effect: they make us more relaxed and less afraid, and they set the tone for dealing with the topic without causing unwanted side effects like increased anxiety.
Real-world cases
Our real-world cases are drawn from actual disinformation campaigns and let students apply their tools to content they might genuinely encounter online. The package includes selected cases from three themes that deserve particular attention in the classroom:
AI-enabled disinformation represents perhaps the most rapidly evolving threat. Generative AI now allows anyone to produce personalised, persuasive content — fake social media posts, cloned voices, deepfake videos, and responsive bot accounts — at a scale that was unimaginable just a few years ago. This gives those with malicious intent a target-rich environment. The classic visual tells of AI imagery (distorted fingers, blurred backgrounds) are already fading as the technology improves. Scalability is the core danger: what once required an entire propaganda operation can now be done in minutes, and very cheaply.
Foreign Information Manipulation and Interference (FIMI) adds a geopolitical dimension. These are coordinated campaigns, often state-sponsored, designed not merely to deceive but to polarise — to fracture social trust and weaken democratic participation. Increasingly, these campaigns rely on bots and AI-generated content to spread disinformation at a speed and on a scale that no human network could ever achieve. Another challenge is the “pollution” of LLMs with disinformation narratives: when malicious actors deliberately flood the internet with false narratives, AI systems that collect this data begin repeating the same messages, misleading their users and deepening polarisation.
Thirdly, climate disinformation includes sophisticated and diverse narratives aimed at sowing doubt and undermining trust in science, with potentially catastrophic consequences. “Climate delay” narratives — the suggestion that solutions won’t work, that clean energy is unreliable, that technology will eventually solve everything, or that responsibility lies solely with individual consumers — are often more subtle than outright denial. They don’t reject science; they sow just enough doubt to prevent coordinated action.
Ultimately, teaching students to resist disinformation is teaching them to participate in democracy. A student who can evaluate a source, recognise manipulation, and maintain healthy scepticism without collapsing into cynicism is a more capable citizen — less susceptible to polarisation, more willing to engage in informed debate, and better equipped to protect others in their communities.
Authors
Annika Niesen, Project Assistant, Beyond the Horizon ISSG vzw
Elina Kuokkanen, Project Manager at Beyond the Horizon, is responsible for coordinating the European project IMMUNE 2 INFODEMIC 2, funded under the EU’s CERV programme.
