Bangladeshi Youth Overwhelmed by "Fake News" on Social Media, UNICEF Poll Reveals
A recent UNICEF poll has highlighted the significant mental health challenges young people in Bangladesh face as a result of pervasive misinformation and harmful content on social media. The anonymous poll, conducted through UNICEF’s U-Report platform, surveyed nearly 29,000 children and young people across the country and revealed a startling statistic: two-thirds of respondents identified "too much fake news and misinformation" as their primary source of stress when using social media. This finding underscores the urgent need for effective strategies to combat the spread of misinformation and create safer online environments for young people.
Beyond the overwhelming concern about fake news, other significant stressors emerged from the poll. Bullying and negative comments were cited by about one in seven respondents as their most stressful experience on social media, with girls reporting this slightly more often than boys. A similar proportion pointed to exposure to harmful or upsetting content as a major contributor to their online stress. These findings highlight the multifaceted nature of the challenges young people face in the digital sphere and the importance of addressing these issues comprehensively.
The poll also explored young people’s views on content regulation on social media platforms. While a minority (23%) expressed concern that rules might stifle free expression, a significantly larger share (52%) believed that regulations are essential to curb harms such as bullying and hate speech. This suggests that many young people recognize the need to balance freedom of expression with protection from online harm.
The potential consequences of lax content moderation were also a significant concern for respondents. A staggering 79% believed that relaxing rules would harm vulnerable groups within their community, with ethnic or religious minorities (30%), children and youth (26%), and women and girls (23%) identified as the most likely targets. This finding underscores the heightened vulnerability of these groups to online harassment and discrimination and reinforces the importance of robust content moderation policies to protect them.
UNICEF Representative in Bangladesh, Rana Flowers, emphasized the real-world impact of online misinformation and hate speech, stating that these phenomena can fuel physical and mental harm, particularly for children already facing discrimination. Flowers further noted that while young people recognize the positive opportunities offered by digital spaces for connection, learning, and debate, they are also increasingly wary of the risks posed by unregulated online environments. This highlights the need for collective action to create safer digital spaces where young people can engage freely without fear of harm.
The poll also pointed to shifts in the content young people encounter online. Over half of respondents reported noticing changes in what they see on social media, and 17% said they feel less safe online as a result. This underscores the dynamic nature of the online landscape and the need for ongoing monitoring and adaptation of safety measures.

UNICEF stressed that policymakers, regulators, tech companies, educators, parents, and young people themselves share responsibility for ensuring children can access accurate information, distinguish truth from misinformation, and navigate digital spaces safely. The agency called for urgent action to develop moderation systems and policies that protect children’s rights, foster safe and inclusive online environments, and provide digital literacy education so young people gain the skills to navigate the digital world responsibly. This multifaceted approach is crucial to mitigating the harms of online misinformation and creating a safer, more empowering digital experience for young people in Bangladesh.