The Evolving Threat of Online Disinformation: How AI is Amplifying the Risks for Children
In an exclusive interview with LBC, Katherine Howard, Head of Education and Wellbeing at Smoothwall by Qoria, a leading provider of online safeguarding solutions, highlighted the alarming evolution of disinformation and its impact on children’s online safety. No longer confined to isolated clickbait articles, misinformation has morphed into complex online ecosystems, amplified by the pervasive power of artificial intelligence. These ecosystems consist of interconnected content designed to exploit trust and manipulate users, ranging from health disinformation and conspiracy theories to harmful myths surrounding self-harm, abuse, and grooming.
Howard emphasized the targeted nature of this misinformation, with children increasingly exposed to tailored content, including viral challenges and conspiracy-style videos that promote distrust in adults. Often disguised as rebellious acts, these fast-paced, emotionally charged narratives spread rapidly and hold particular appeal for children, driving a wedge between them and trusted figures like parents and teachers while undermining the credibility of adult guidance.
While today’s children are often adept at navigating digital platforms, Howard stressed the critical distinction between digital fluency and digital literacy. Simply knowing how to use a platform doesn’t equate to understanding the underlying mechanisms and potential manipulative tactics employed online. Young people, she explained, often struggle to differentiate between credible sources and manipulative content, especially when it’s integrated into popular meme culture or presented as advice from influential online personalities. This lack of critical evaluation leaves them vulnerable to the deceptive nature of carefully crafted misinformation campaigns.
Howard argued that digital literacy is no longer a supplementary skill but an essential life skill for navigating the modern digital landscape. It equips children with the ability to recognize when algorithms are influencing their feeds or when content is deliberately designed to manipulate their emotions. Importantly, digital literacy instills a sense of control over their digital experiences, empowering them to resist manipulation and make informed choices online. This empowerment is a crucial step towards building resilience against the pervasive influence of online misinformation.
The insidious nature of disinformation, Howard explained, extends to online grooming, where manipulators exploit children’s search for community and belonging. Misinformation becomes believable, she noted, when it originates from someone a child admires or trusts, whether a friend met online or a followed influencer. Children experiencing low self-esteem or feelings of isolation are particularly susceptible to these tactics, seeking validation and acceptance in online communities that may harbor malicious intent.
Addressing the complex challenge of combating online disinformation, Howard advocated for a shared responsibility approach involving schools, parents, tech platforms, and government. No single entity can effectively tackle this crisis alone. A consistent safety net, spanning both online and offline worlds, is crucial for protecting young people. This collaborative approach ensures that children are equipped with the tools and support they need to navigate the digital landscape safely and responsibly.
Howard urged parents to maintain open communication with their children, resisting the urge to shut down conversations when children repeat misinformation. Instead, she suggested engaging in dialogue by asking questions like, “Where did you hear that?” or “How do you know that’s true?” This approach fosters critical thinking and encourages children to question the validity of information they encounter online. It builds trust and opens avenues for discussion rather than driving children towards secrecy.
Smoothwall by Qoria plays a vital role in this effort by providing safeguarding tools to more than a third of UK schools, including real-time monitoring that alerts staff when students are exposed to harmful or misleading content. This proactive approach helps to identify and mitigate the risks associated with online misinformation and harmful content.
Looking towards the future, Howard identified digital literacy as the single most important skill children need by 2025, but emphasized the importance of a supportive environment built on trust. When children feel comfortable confiding in parents or teachers without fear of judgment, harmful situations can be addressed early. Cultivating this culture of trust is paramount in safeguarding children from the pervasive threats of online misinformation and abuse. This, combined with comprehensive digital literacy education, forms the foundation for a safer and more empowering online experience for future generations.