Nairobi, Kenya — On the verge of tears, Nathan Nkunjimana remembered watching a video of a child being molested and a woman being murdered.
For eight hours a day, his job as a content moderator for a Facebook contractor required him to witness horrors so the world wouldn’t have to. Some overwhelmed co-workers would scream or cry, he said.
Now, Nkunjimana is one of nearly 200 former Kenyan employees suing Facebook and local contractor Sama over working conditions, in a case that could have ramifications for social media moderators around the world. It is the first known court challenge of its kind outside the United States, where Facebook reached a settlement with moderators in 2020.
The group worked at the social media giant’s outsourced hub for content moderation in Kenya’s capital, Nairobi, where workers screened users’ posts, videos, messages and other content from across Africa, removing material that violated Facebook’s community standards and terms of service.
Moderators from several African countries are seeking a $1.6 billion compensation fund, alleging poor working conditions, including inadequate mental health support and low pay. Earlier this year, they were laid off by Sama as it left the business of content moderation. They say the companies are ignoring a court order to extend their contracts until the case is resolved.
Facebook and Sama have defended their employment practices.
With little certainty about how long the case will take to conclude, the moderators expressed despair as money and work permits run out and they grapple with the traumatic images that haunt them.
“If you feel comfortable browsing and going through a Facebook page, it’s because someone like me has been on that screen, checking, ‘Is it OK to be here?’” Nkunjimana, a father of three from Burundi, told The Associated Press in Nairobi.
The 33-year-old said content moderation is like “soldiers” taking a bullet for Facebook users, with workers watching harmful content depicting murder, suicide and sexual assault and making sure it is removed.
For Nkunjimana and others, the job began with a sense of pride; they felt like “heroes to the community,” he said.
But as exposure to alarming material reawakened past trauma for some who, like him, had fled political or ethnic violence back home, the moderators found little support and a culture of secrecy.
They were asked to sign non-disclosure agreements. Personal items such as phones were not allowed in the workplace.
After his shift, Nkunjimana would go home exhausted and often lock himself in his bedroom to try to forget what he had seen. Even his wife had no idea what his job was like.
These days, he locks himself in his room to avoid his sons’ questions about why he is no longer working and why they can no longer pay school fees. Content moderators were paid $429 per month, with non-Kenyans receiving a small expatriate allowance on top of that.
Nkunjimana said the Facebook contractor, U.S.-based Sama, did little to ensure that professional post-trauma counseling was available to moderators in its Nairobi office. He said the counselors were poorly trained to deal with what his colleagues were experiencing. Now, without any mental health care, he immerses himself in church instead.
Facebook’s parent company, Meta, has said its contractors are contractually obligated to pay their employees above the industry standard in the markets where they operate and to provide on-site support from trained practitioners.
A spokesperson said Meta could not comment on the Kenyan case.
In an email to the AP, Sama said the wages it paid in Kenya were four times the local minimum wage and that “more than 60% of male workers and over 70% of female workers were living below the international poverty line (less than $1.90 a day)” before being hired.
Sama said all employees had unlimited access to one-on-one counseling “without fear of repercussions.” The contractor also described a recent court decision extending the moderators’ contracts as “confusing” and said a later ruling suspending that decision meant it had not taken effect.
Such work has the potential to be “incredibly psychologically damaging,” but job seekers in lower-income countries may take the risk in exchange for an office job in the tech industry, said Sarah Roberts, an expert in content moderation at the University of California, Los Angeles.
In countries like Kenya, where cheap labor is abundant, the outsourcing of such sensitive work is “a story of an exploitative industry predicated on using global economic inequality to its advantage, doing harm and then taking no responsibility, because the firms can say, ‘Well, we never employed so-and-so, that was, you know, the third party,’” she said.
In addition, the mental health care provided may not be “the cream of the crop,” and concerns have been raised about medical privacy, said Roberts, an associate professor of information studies.
What sets the Kenyan court case apart, she said, is that the moderators are organizing and pushing back against their conditions, creating unusual visibility. The usual strategy for companies in such cases in the U.S. is to settle, she said, but “if cases are brought in other places, it may not be so easy for the companies to do that.”
Facebook invested in moderation hubs around the world after being accused of allowing hate speech to circulate in countries like Ethiopia and Myanmar, where conflicts were killing thousands and harmful content was being posted in a variety of local languages.
Hired by Sama in Kenya for her fluency in various African languages, one content moderator soon found herself viewing graphic content that hit close to home.
The two years Fasika Gebrekidan worked as a moderator roughly overlapped with the war in the northern Tigray region of her native Ethiopia, where hundreds of thousands of people were killed and many Tigrayans like her knew little about the fate of their loved ones.
Already suffering trauma from fleeing the conflict, the 28-year-old spent her workday watching “gruesome” videos and other material related to the war, including content depicting rape. For any given video, she had to watch the first 50 seconds and the last 50 seconds to decide whether it should be taken down.
The gratitude she had felt at getting the job quickly vanished.
“You run away from the war, then you have to see the war,” Fasika said. “It was just a torture for us.”
Now she has no income and no permanent home. She said she would look for new opportunities if only she could feel normal again. A former journalist, she can no longer bring herself to write, even to express her feelings.
Fasika worries that “this trash” will always be in her head. While speaking with the AP, her eyes fixed on a painting across the cafe: a crimson portrait of what appeared to be a man in distress. It bothered her.
Fasika blames Facebook for the lack of proper mental health care and decent pay, and accuses its local contractor of using her and then letting her go.
“Facebook should know what’s going on,” she said. “They should care about us.”
The fate of the moderators’ complaint rests with the Kenyan court, with the next hearing scheduled for July 10.
The uncertainty is disheartening, Fasika said. Some moderators are giving up and returning to their home countries, but that is not yet an option for her.
AP Business Writer Calvin Chan contributed from London.