Kenyan Facebook content moderators describe work as ‘torture,’ file lawsuit with potential global impact

June 29, 2023

On the brink of breaking down, Nathan Nkunzimana described the chilling experience of viewing a video of a child’s exploitation and a woman’s murder.

His role as a Facebook content moderator, contracted through a local firm, required him to view such traumatic content for eight hours a day to shield the rest of the world from it. Some colleagues were so affected that they would break down in tears or scream out loud, Nkunzimana said.

Nkunzimana has now joined nearly 200 other former employees in Kenya in a landmark lawsuit against Facebook and its local contractor, Sama, over their working conditions, a case that could have repercussions for content moderators worldwide. It is the first lawsuit of its kind outside the United States, where Facebook settled a similar case with its moderators in 2020.

These employees worked at Facebook’s content moderation hub in Nairobi, Kenya, where they were responsible for screening posts, videos, messages, and other content from Africa. Their duty was to remove any content breaching Facebook’s community guidelines and terms of service.

The plaintiffs, comprising moderators from different African nations, seek a $1.6 billion compensation fund. They allege low wages, inadequate mental health support, and poor working conditions. These workers were laid off earlier this year when Sama withdrew from content moderation operations. The former employees claim both firms have disregarded a court injunction to extend their contracts until the lawsuit concludes.

Both Facebook and Sama have defended their employment standards.

As uncertainty lingers about when the case will conclude, the moderators are increasingly anxious as they struggle with expiring work permits, dwindling finances, and the haunting memories of traumatic content.

Nkunzimana, a father of three from Burundi, explained his work to The Associated Press, saying, “If you feel comfortable browsing and going through the Facebook page, it is because there’s someone like me who has been there on that screen, checking, ‘Is this okay to be here?’”

He likened content moderation to soldiers taking a bullet for Facebook users, screening disturbing content depicting violence, suicide, and sexual assault to ensure its removal.

Initially, the job gave Nkunzimana and others a sense of pride. But prolonged exposure to distressing content, a workplace culture of secrecy, and scant professional support took their toll.

After gruelling shifts, Nkunzimana would often retreat to his bedroom, hiding the day’s horrors even from his wife.

Now, he isolates himself to avoid his sons’ questions about why he is out of work and whether he will still be able to pay for their education. Content moderators earned $429 a month, with non-Kenyans receiving a small expatriate allowance on top.

Nkunzimana criticized the U.S.-based contractor, Sama, for neglecting to provide substantial professional post-traumatic counselling for Nairobi-based moderators. He stated that the available counsellors were ill-equipped to handle the trauma experienced by his colleagues. Deprived of professional mental health support, he now seeks solace in his faith.

Facebook’s parent company, Meta, requires its contractors to pay employees above the industry standard in the markets where they operate and to offer on-site support from trained professionals.

A spokesperson said Meta was unable to comment on the ongoing Kenyan lawsuit.

Sama told the AP by email that its salaries in Kenya were four times the local minimum wage and that most of its employees had lived below the international poverty line before being hired.

The contractor said all employees had unrestricted access to one-on-one counselling “without fear of repercussions.” Sama described a recent court decision to extend the moderators’ contracts as “confusing” and said a subsequent ruling pausing that decision meant it had not yet been enforced.

Sarah Roberts, a content moderation expert at the University of California, Los Angeles, said the job can cause serious psychological damage, a risk that workers in lower-income countries may accept in exchange for a position in the tech industry.

She argued that outsourcing such sensitive work to countries like Kenya is exploitative, with the industry taking advantage of global economic disparities while avoiding accountability.

The Kenyan lawsuit stands out because the moderators are openly challenging their working conditions, giving the issue unusual visibility. Roberts observed that companies typically settle such cases in the U.S., but lawsuits filed elsewhere may not be resolved as easily.

Facebook established moderation hubs worldwide after being accused of failing to curb hate speech in countries like Ethiopia and Myanmar, where conflicts resulted in mass casualties and harmful content was disseminated in various local languages.

The multilingual content moderators in Kenya, hired for their proficiency in various African languages, were soon exposed to graphic content that was painfully personal.

Fasica Gebrekidan, from Ethiopia, worked as a moderator during the war in her homeland’s northern Tigray region, forcing her to confront the conflict’s reality on screen while her loved ones’ fate remained unknown.

Having fled the war herself, Fasica described the job as “just torture” because it forced her to keep witnessing the conflict through her work. She is now without a stable income or a permanent home, too traumatized to return to her former career as a journalist.

She faults Facebook for failing to provide adequate mental health care and fair wages, and accuses the local contractor of exploiting her.

The Kenyan court will decide the outcome of the moderators’ complaint, with the next hearing scheduled for July 10.

The uncertainty is frustrating for Fasica, who notes that some moderators have given up and returned to their home countries. However, that is not currently an option for her.

The outcome of this case could have far-reaching effects on social media platforms and their third-party contractors, highlighting the need for adequate mental health support and fair compensation in content moderation. As the legal battle plays out, the daily mental toll on these moderators is evident, and their struggle is a stark reminder of the unseen workers who shield the rest of us from the most horrifying corners of the digital world.
