The Perils of Unregulated Social Media: Insiders Expose the Dark Side of Platforms
A growing chorus of former social media employees, once tasked with user safety, is sounding the alarm about the unchecked dangers festering on platforms like X (formerly Twitter), Instagram, TikTok, and Facebook. These whistleblowers allege that inadequate safeguards, coupled with corporate reluctance to address criticism, are exposing users, particularly young people, to disinformation, threats to democracy, and extreme content. Their testimonies paint a disturbing picture of platforms prioritizing profit over the well-being of their users, leading to devastating consequences.
Instagram’s Inaction: A Father’s Plea Ignored
Arturo Béjar, a former Meta product safety engineer, returned to the company in 2019 driven by personal concerns about his teenage daughter’s experience with unwanted sexual advances on Instagram. He was dismayed to find that previously implemented safety measures had been dismantled. Although he shared alarming internal research with Mark Zuckerberg highlighting the high prevalence of harmful experiences among young users, including unwanted sexual advances and exposure to self-harm content, his warnings were ignored. He also presented his concerns and potential solutions to Instagram head Adam Mosseri, who acknowledged the issues but failed to implement the suggested changes. While Meta claims to have taken steps to protect teens, Béjar argues these are insufficient and that the company prioritizes reputation management over user safety. He asserts that Meta possesses the capability to enact meaningful change but lacks the will to prioritize it.
A Grieving Mother’s Fight for Accountability
Lori Schott believes her daughter, Anna, would be alive today if Meta had taken stronger action to safeguard young users. Anna died by suicide in 2020 after becoming addicted to social media, particularly Instagram and TikTok. Schott contends that the platforms, aware of Anna’s vulnerability to anxiety and depression, algorithmically fed her harmful content that exacerbated her mental health struggles. She is now among the numerous parents suing Meta, citing internal documents revealing that Zuckerberg was warned about the dangers of Instagram, yet dismissed proposed changes as "paternalistic." Schott holds Zuckerberg directly responsible for her daughter’s death and demands accountability for the tragic consequences of Meta’s inaction.
X: From Platform to Disinformation Machine
Eddie Perez, former head of Twitter’s disinformation and extremism team, describes the platform’s descent under Elon Musk’s ownership. He argues that Musk has dismantled the safeguards previously in place and transformed X into a “disinformation machine,” amplifying harmful narratives and conspiracy theories. Perez cites Musk’s own posts, which would have violated previous platform rules, as evidence of this shift. He highlights examples such as Musk’s suggestions of inevitable civil war, his musings about the lack of assassination attempts against US political figures, and his amplification of false claims regarding Haitian migrants. Perez believes that X is having a profoundly negative impact on the media ecosystem and warns that its influence on culture and politics should not be underestimated.
Erosion of Safety Measures Across Platforms
Frances Haugen, another Facebook whistleblower, argues that Musk’s drastic workforce reductions at Twitter emboldened Zuckerberg to follow suit, decimating safety teams across Meta’s platforms. Haugen, who exposed internal documents revealing Facebook’s awareness of its harmful effects, believes the platform has become even more dangerous since her revelations. She points to the closure of CrowdTangle, a transparency tool, as evidence of Meta’s increasing opacity. Zvika Krieger, a former Facebook safety specialist, echoes these concerns. Despite years of experience working to mitigate harmful content, Krieger wouldn’t allow his own children to use social media platforms, highlighting the severity of the risks. He criticizes companies for prioritizing user engagement and revenue over implementing safety measures, which are often viewed as impediments to the user experience.
TikTok’s Algorithm: A Gateway to Hate and Harm
Lori Schott also blames TikTok for her daughter Anna’s death, citing the platform’s algorithm for feeding Anna videos related to death and suicide. The content, seemingly tailored to exploit Anna’s vulnerabilities, reinforced her suicidal ideation. Andrew Kaung, a former TikTok planning analyst, corroborates the dangers posed by the platform’s algorithm. He describes witnessing horrific content, including violence, animal abuse, and child sexual abuse material, and expresses concern about the platform’s targeting of young users. Kaung argues that self-regulation by social media companies is inherently flawed, comparing it to "asking a tiger not to eat you." He, like the other whistleblowers, advocates for robust external regulation to address the pervasive harms facilitated by these platforms.
The collective testimonies of these former insiders paint a deeply concerning picture of the social media landscape. Their accounts reveal a pattern of platforms prioritizing profit over user safety, allowing harmful content to proliferate, and actively resisting meaningful change. The whistleblowers’ calls for external regulation underscore the urgent need for intervention to protect users, particularly young people, from the insidious dangers lurking within these powerful platforms.