Meta Faces Lawsuits in Ghana Over Harm to Content Moderators: A Human Cost of the Digital Age

The Silent Suffering Behind Our Feeds: Ghana's Content Moderators Stand Up Against the Trauma They Were Forced to Endure

By Srayan Chowdhury · Published 9 months ago · 3 min read

Meta, the tech giant behind Facebook and Instagram, is once more being asked hard questions about how it treats the invisible people who keep our online spaces safe, this time from Ghana. In Accra, a group of former content moderators is preparing to take legal action, not for money alone, but to seek justice for the silent wounds they carry from the traumatic content they had to police every day.

Behind Every Safe Post Is a Human Being

For most of us, scrolling through Facebook or Instagram is mindless entertainment. But thousands of real people sit behind computer screens for hours, absorbing the darkest parts of the internet to safeguard users like you and me. Solomon and Abel are two of them.

Solomon, a young East African, once had aspirations of improving his life. He took a job as a content moderator for Majorel, a company contracted by Meta, working at a site in Ghana’s capital. Every day he watched videos of murder, suicide, and child exploitation. Slowly, the images sank deep into his mind, invading even his dreams. Depression set in. Thoughts of suicide followed. After Solomon sought treatment for his major depressive disorder, he was let go without support and shown the exit.

He was not alone. Abel, another former moderator, tells a similar story. He describes feeling "broken inside" after months of exposure to cruelty and violence, in exchange for a wage that was higher than Ghana's minimum wage but came nowhere close to covering the emotional scars he will carry for the rest of his life.

Lawsuits Demand Accountability

Now, with the help of the UK-based nonprofit Foxglove and the Ghanaian law firm Agency Seven Seven, these former moderators are fighting back. Two lawsuits are on the horizon: one seeking compensation for Solomon's psychological trauma and another alleging that he was wrongfully fired for seeking help. Their demands are simple but profound: recognition, accountability, and real change.

They argue that Meta cannot simply hide behind outsourcing contracts to avoid responsibility. After all, they say, everything — from their work hours to the standards they enforced — was dictated by Meta. Yet when it came to mental health support, they were left largely unprotected.

Working Conditions: A Life of Surveillance and Silence

Moderators lived together in cramped, often monitored housing. Some were packed two to a room, five to a flat. Privacy was a luxury they could not afford, even outside work hours. Mental health support, they claim, was a mere formality — often offered by people without proper medical training. Worse still, anyone who tried to talk about their pain was met with suspicion rather than empathy. When these moderators signed up, they believed they were entering a respectable digital workplace. What many found instead was emotional isolation and a sense that they were expendable.

Majorel and Meta Respond

Both Meta and Majorel (owned by outsourcing giant Teleperformance) deny wrongdoing. Teleperformance insists that workers were treated fairly — citing air-conditioned apartments, gym facilities, and counseling services. Meta says it requires its partners to maintain “above-industry standards” for pay and support, and that confidentiality agreements protect both workers and users.

But for the men and women living the reality behind those corporate statements, the story feels very different.

A Pattern Emerging Across Africa

This isn’t the first time Meta has been called out. In Kenya, over 140 former moderators sued Meta and Sama, another outsourcing firm, alleging widespread PTSD and low wages. That case is still unfolding. Together, the lawsuits in Ghana and Kenya paint a worrying picture: the tech industry routinely relies on outsourced moderators who are often left feeling abandoned and emotionally crushed.

A Time of Transition

What these lawsuits truly seek is not just money. They demand a different kind of reckoning — a call to tech giants to recognize that real human beings, not algorithms, are absorbing the worst of humanity every day to keep their platforms clean.

As Solomon and others bravely step forward, they force us all to confront an uncomfortable truth: in the digital age, invisible workers are paying an enormous price for our comfort online. The world and businesses like Meta need to pay attention.

© 2026 Creatd, Inc. All Rights Reserved.