Misinformation Plagues TikTok Mental Health Content: Investigation Reveals
A recent investigation has uncovered a troubling trend on the popular social media platform TikTok: a significant portion of trending videos offering mental health advice are disseminating misinformation. The Guardian’s analysis of the top 100 trending videos under the hashtag #mentalhealthtips revealed that 52% contained questionable advice, unproven supplement recommendations, and dubious quick-fix solutions for complex mental health challenges. This alarming discovery raises serious concerns about the potential harm such misinformation can inflict on vulnerable individuals seeking support and guidance online.
The misinformation ranged from seemingly harmless yet ineffective tips, such as eating an orange in the shower to alleviate anxiety, to more concerning claims promoting unverified supplements like saffron, magnesium glycinate, and holy basil for anxiety reduction. Even more troubling were videos suggesting one-hour cures for severe trauma and misrepresenting normal emotional experiences as symptoms of serious mental illnesses like borderline personality disorder (BPD). Experts warn that such content can trivialize the experiences of those genuinely suffering from these conditions, while simultaneously misleading and potentially harming those seeking legitimate help.
Medical professionals have expressed grave concerns about the spread of this misinformation. David Okai, a consultant neuropsychiatrist at King’s College London, criticized the misuse of therapeutic language and the reliance on anecdotal evidence in many of these videos. He emphasized that personal experiences are not universally applicable and that such advice can be detrimental. Dan Poulter, a former health minister and NHS psychiatrist, echoed these concerns, pointing out that some videos pathologize everyday emotions, falsely equating them with serious mental illnesses. This can not only misinform viewers but also minimize the struggles of those living with diagnosed conditions.
Amber Johnston, a British Psychological Society-accredited psychologist, specifically reviewed videos related to trauma and expressed concern about the oversimplification and homogenization of PTSD experiences. She highlighted the highly individual nature of trauma and PTSD symptoms, which cannot be accurately captured in short, generalized videos. Johnston warned that these videos could make viewers feel worse by setting unrealistic expectations for recovery and creating a sense of failure when these "quick fixes" prove ineffective. The pervasiveness of this misinformation underscores the need for accurate, evidence-based information from qualified professionals in the online space.
Chi Onwurah, a Labour MP and chair of the Science, Innovation and Technology Committee, acknowledged the ongoing investigation into misinformation on social media and cited "significant concerns" about the Online Safety Act's effectiveness in combating the problem. Onwurah specifically pointed to the role of content recommender systems, like those used by TikTok, in amplifying potentially harmful misinformation. The concern is that these algorithms can inadvertently promote misleading mental health advice, exposing a larger audience to inaccurate and potentially damaging information. The committee's findings underscore the urgency of addressing shortcomings in the Online Safety Act to better protect the public's online safety and health.
TikTok, in response to the investigation, stated that videos discouraging professional medical support or promoting dangerous treatments are removed. The platform also emphasized its efforts to direct UK users searching for mental health terms to official NHS guidance. A TikTok spokesperson defended the platform as a space for self-expression and community support, arguing that the study's methodology unfairly restricts free expression and prevents individuals from sharing their personal experiences. The spokesperson further highlighted TikTok's collaboration with health experts at the World Health Organization and the NHS to promote reliable information and proactively remove harmful content.

However, critics point to the absence of any explicit mention of mental health disinformation in TikTok's guidelines, which currently address suicide, self-harm, and disordered eating, but not the spread of misleading mental health advice. The government, for its part, affirmed its commitment to tackling harmful mis- and disinformation online through the Online Safety Act, which mandates increased transparency from social media platforms about permissible content and measures to protect children from harmful material.

The ongoing debate highlights the complex interplay between freedom of expression, platform responsibility, and the urgent need to safeguard users from potentially harmful misinformation, particularly in the sensitive realm of mental health.