Meta Faces Second Legal Battle in Africa Over Content Moderator Trauma

Meta, the parent company of Facebook and Instagram, is embroiled in another legal battle in Africa over the psychological well-being of its content moderators. Following a lawsuit filed by Kenyan moderators earlier this year, a new legal action is brewing in Ghana, where moderators employed by a Meta contractor, Majorel, allege severe mental health consequences due to their exposure to graphic and disturbing content. The allegations paint a grim picture of working conditions, echoing the claims of Kenyan moderators who endured similar trauma. This unfolding situation raises serious questions about Meta’s responsibility for the welfare of its outsourced workforce tasked with keeping its platforms safe.

The Ghanaian moderators, numbering around 150, claim they experience depression, anxiety, insomnia, and substance abuse as a direct result of their work, which involves reviewing highly disturbing content, including depictions of murders, extreme violence, and child sexual abuse. One moderator, who remains anonymous for legal reasons, alleges he attempted suicide due to the trauma of his work and was subsequently dismissed. The legal action is being prepared by the UK-based non-profit Foxglove, in collaboration with the Ghanaian firm Agency Seven Seven. They are investigating allegations of psychological harm and unfair dismissal, with the potential for a group lawsuit representing the affected moderators.

The working conditions described by the Ghanaian moderators are deeply troubling. They report that mental health support is inadequate and delivered by staff without medical qualifications, and allege that confidential disclosures about their work-related trauma were shared amongst managers. Furthermore, they claim their base wages are below the cost of living in Accra, forcing them to work overtime at even lower rates, while facing penalties for failing to meet performance targets. The moderators are reportedly housed in cramped conditions, with multiple people sharing rooms, and subject to a culture of secrecy and surveillance by managers.

Teleperformance, the French multinational that owns Majorel, disputes these claims. The company states that moderators are paid significantly above minimum wage, provided with upscale housing that includes amenities like gyms and pools, and offered robust mental health support from licensed professionals with master’s degrees in relevant fields. Teleperformance maintains that moderators are fully informed about the nature of the content they will be reviewing and provides comprehensive wellbeing programs. They emphasize their transparency and commitment to supporting moderators throughout their employment.

This emerging legal battle in Ghana mirrors the ongoing case in Kenya, where over 140 content moderators employed by Samasource, another Meta outsourcing contractor, sued both companies in 2023 after being diagnosed with severe PTSD. Both cases underscore the ethical and legal challenges of content moderation in the age of social media. The outsourcing of this crucial, yet psychologically demanding, work to countries with lower labor costs raises concerns about worker exploitation and the adequacy of protections for vulnerable employees.

Foxglove and Agency Seven Seven are seeking court-ordered improvements to the moderators’ workplace, including robust mental health safeguards and appropriate psychiatric care. They argue that current Ghanaian law needs to catch up with the realities of virtual work and recognize psychological harm as a valid basis for legal action. This case has the potential to set a precedent for worker protections in the digital age, extending beyond physical injury to encompass the psychological toll of content moderation. The outcome of this litigation could significantly impact how social media companies manage their content moderation workforce globally. The ongoing legal battles in Africa highlight the urgent need for improved industry standards and greater accountability from tech giants like Meta to ensure the well-being of their content moderators.
