
Meta, the parent company of Facebook, Instagram, and WhatsApp, is facing renewed legal challenges in Africa over its alleged role in the mental health crisis affecting its content moderators. These moderators are reportedly required to review highly disturbing material—including graphic violence, murder, and child sexual abuse—without sufficient psychological support. The lawsuits highlight the human cost of maintaining platform safety and raise critical questions about the ethical responsibilities of global tech firms towards African workers.
Ghana Joins Continental Legal Fight
Ghana is the latest country in which Meta faces legal action, joining ongoing battles in Kenya and South Africa. Ghanaian moderators allege the company breached its duty of care by exposing them to harmful content without adequate mental health safeguards. Plaintiffs report severe mental health issues similar to those documented in other African cases. These claims are backed by a joint investigation by The Guardian (UK) and the Bureau of Investigative Journalism.
The Ghana lawsuit intensifies calls for Meta to be held accountable for failing to protect those tasked with the hidden yet vital work of moderating online content. It adds to a growing continental movement pushing for better working conditions and accountability in digital labour.
Legal Action in South Africa and Kenya
In South Africa, a lawsuit claims that African content moderators were afforded a lower standard of care than their counterparts in other regions. Plaintiffs argue that Meta owes all workers—regardless of location—a fundamental duty of care, which current provisions fail to meet.
The legal fight began in Kenya, where current and former moderators filed a class action over poor working conditions and a lack of psychological support. Plaintiffs reported conditions leading to PTSD, anxiety, and depression. Their goal is to hold Meta accountable and compel the company to improve its protections for moderators.
Kenyan courts have also ruled that Meta can be held accountable for its alleged role in amplifying hate speech during the conflict in Ethiopia—rejecting Meta’s argument that Kenyan courts lacked jurisdiction.
The Hidden Toll of Content Moderation
Most content moderators are based in the Global South and serve as the first line of defence against violent and abusive material. Plaintiffs describe trauma, insomnia, and emotional distress, allegedly worsened by Meta’s failure to provide adequate psychological care.
Meta’s Response
Meta maintains that it prioritises the well-being of its moderation workforce. The company cites measures such as on-site wellness coaches, therapy access, and resilience training. A legal representative stated Meta is taking the accusations seriously and is committed to a safe, inclusive work environment.
Potential Defences
Meta may argue that it complies with local laws, that content reflects varying global standards, and that moderators were informed of potential risks in their contracts. The company is also likely to highlight its investments in mental health initiatives and argue it has exercised reasonable diligence in addressing harm.
Broader Implications for Tech Accountability in Africa
These lawsuits are part of a wider movement demanding accountability from multinational tech companies operating in Africa. As the continent’s digital footprint grows, so does the need for fair labour conditions and mental health support for digital workers.
Courts Face Ethical and Legal Tests
Courts in Kenya, South Africa, and Ghana must now weigh Meta’s operational needs against the mental health and human rights of its workers. These cases may define how global platforms balance business interests with employee welfare, particularly in emerging markets.
A Changing Legal Landscape
The legal actions in Ghana, South Africa, and Kenya reflect a growing demand for corporate accountability. If successful, they could set a powerful precedent for labour protections across the tech industry. Conversely, failure could reinforce inadequate systems and weaken global efforts to safeguard digital workers’ rights.
Sources:
- The Guardian and Bureau of Investigative Journalism joint investigation
- Section 230 – U.S. Code (47 U.S.C. § 230)
- Meta Workplace Terms of Service
DISCLAIMER: The Views, Comments, Opinions, Contributions and Statements made by Readers and Contributors on this platform do not necessarily represent the views or policy of Multimedia Group Limited.