By Chinasaokwu Helen Okoro

Content moderators have filed a lawsuit against tech giant Meta, claiming they suffered psychological harm from reviewing and removing disturbing material from its social media platforms.

The moderators are employed by Majorel, an Accra-based company contracted by Meta to enforce its community standards by screening content.

According to the workers, their duties led to serious mental health issues, including depression, anxiety, insomnia, and substance abuse.

They allege that the mental health support provided by their employer was inadequate and that their requests for assistance were ignored.

Teleperformance, the parent company of Majorel, has disputed these allegations, according to The Guardian.

This lawsuit follows a similar case in Kenya, where over 100 Facebook content moderators were diagnosed with severe post-traumatic stress disorder after prolonged exposure to graphic content.

Social media companies often hire moderators to remove harmful or offensive material and to help train automated systems to detect such content.
