Meta, the parent company of Facebook, WhatsApp, and Instagram, is facing legal challenges in Kenya over its treatment of content moderators. Three complaints have been filed against Meta and its subcontractor Sama, which is responsible for moderating Facebook content in sub-Saharan Africa.
Two of the complaints were lodged by content moderators in Nairobi who were tasked with removing violent, hateful, and false content from the platform.
The first, filed in May 2022 by Daniel Motaung, a South African employee, alleges inhumane working conditions, misleading hiring practices, inadequate compensation, and a lack of psychological support.

Motaung claims he was dismissed after attempting to form a trade union. In March, a second complaint was filed by 184 employees who say they were wrongfully terminated by Sama. They are seeking compensation for inadequate wages and for the mental health toll of their work.
The Employment Tribunal has suspended the dismissals and ordered Meta and Sama to provide appropriate psychological and medical care to the plaintiffs.
Both Meta and Sama have announced their intention to appeal the ruling. A third complaint, filed in December 2022, accuses the companies of negligence in addressing hate speech that allegedly contributed to the murder of a university professor in Ethiopia in 2021.

Critics argue that Meta’s outsourcing system is an attempt to evade responsibility. Meta outsources content moderation to companies at more than 20 sites worldwide, which together handle millions of items daily.
These cases represent significant legal challenges to content moderation practices and echo the class-action lawsuit filed against Facebook in the United States in 2018, in which the company agreed to pay $52 million in compensation to moderators for the harm to their mental health.
