A Kenyan court has dealt a significant setback to Meta, the social media giant, by ruling that it is the primary employer of the content moderators who sued both Meta and its African content review partner, Sama, for unlawful dismissal. The decision has far-reaching implications for Meta’s content moderation operations in Africa, and it sheds light on the complex relationships between Meta, Sama, and the moderators.
Background: The Lawsuit and Its Origins
The lawsuit originated in March, when 184 content moderators sued Meta and Sama, alleging unlawful dismissal after Sama wound down its content review work and the contract moved to Meta’s new content review partner in Africa, Majorel. These moderators were responsible for reviewing posts on Meta’s platforms, filtering out content that incited hate, spread misinformation, or promoted violence.
The moderators claimed that Sama had fired them illegally, without issuing the redundancy notices required by Kenyan law. They further alleged that signing non-disclosure documents was made a precondition for receiving their terminal dues and that they were never given a 30-day termination notice.
Meta’s Attempts to Distance Itself from the Case
Meta sought to distance itself from the case, arguing that it was not the moderators’ employer. However, Justice Byram Ongaya of Kenya’s employment and labor relations court ruled on Friday that Meta was indeed the primary employer of the content moderators. The decision rested on the findings that the moderators did Meta’s work, used its technology, and adhered to its performance and accuracy metrics, while Sama was “merely an agent…or manager” in the arrangement.
The Court’s Decision: Implications for Meta and Sama
The ruling has significant consequences for both companies. The court directed that the moderators’ contracts be extended and barred Meta and Sama from laying off the moderators pending the outcome of the case, having found no suitable justification for the redundancies: the job of content moderation is still available, the court held, and the moderators should continue working under the same or better terms.
Meta’s Legal Troubles in Kenya
This case is not the only legal challenge Meta is facing in Kenya. Two other lawsuits have been filed against the company:
- Daniel Motaung, a South African, sued Meta for forced labor and human trafficking, unfair labor practices, union busting, and failure to provide adequate mental health and psychosocial support. Motaung alleges that he was laid off for attempting to unionize Sama’s employees and for organizing a 2019 strike.
- A group of Ethiopians filed a lawsuit in December last year, claiming that Meta failed to implement sufficient safety measures on Facebook, which in turn fueled the conflict that led to the deaths of over 500,000 Ethiopians during the Tigray War, among them the father of one of the petitioners.
Sama’s Response to the Allegations
Sama has disputed the court’s ruling, stating that Meta is a client of Sama’s and that Sama is not legally empowered to act on behalf of Meta. In the past, Sama has also claimed that it observed Kenyan law by communicating its decision to discontinue content moderation through a town hall, email, and notification letters.
Sama, which also counts OpenAI among its clients, decided to drop the Meta contract and exit content review services entirely, issuing redundancy notices to 260 moderators. The company has chosen to concentrate on labeling work, specifically computer vision data annotation.
The Broader Context: Content Moderation and Social Media
This case highlights the complex and often controversial nature of content moderation on social media platforms. Content moderators play a crucial role in ensuring that social media remains a safe space for users by filtering out harmful content. However, the working conditions and treatment of these moderators have come under scrutiny in recent years.
The Global Content Moderation Landscape
Content moderation is not just an issue in Africa; it is a global concern. Meta employs thousands of content moderators worldwide to monitor and remove inappropriate content from its platforms. These moderators are employed either directly by Meta or indirectly through third-party companies, such as Sama.
Challenges Faced by Content Moderators
Daily exposure to graphic and disturbing content can significantly impact the mental health of content moderators. Additionally, they may face job insecurity, low pay, and high-stress working conditions. These challenges have led to lawsuits and labor disputes in various countries, bringing the issue of content moderation to the forefront of public discourse.
The Future of Content Moderation and Meta’s Role
As the primary employer of the content moderators in this case, Meta is now responsible for addressing the concerns they have raised and for ensuring that their working conditions and treatment are fair. The ruling could set a precedent for other cases involving content moderation and the responsibilities of social media companies.
Meta’s Commitment to Content Moderation
Meta has previously announced plans to invest in AI and machine learning technologies to improve content moderation on its platforms. However, human content moderators will continue to play a vital role in ensuring the safety and integrity of social media.
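To see why human reviewers remain essential even as automation improves, consider how ML-assisted moderation pipelines are typically structured: the classifier acts only on the cases it is confident about and escalates everything ambiguous to a person. The sketch below is a minimal, hypothetical illustration of that human-in-the-loop triage pattern; the thresholds, the `violation_score` stub, and the queue names are invented for this example and are not drawn from Meta’s actual systems.

```python
# Illustrative human-in-the-loop triage for content moderation.
# An automated classifier scores each post; only confident decisions
# are automated, and ambiguous posts go to a human moderator.

from dataclasses import dataclass

# Hypothetical confidence thresholds (assumptions, not real values).
AUTO_REMOVE_THRESHOLD = 0.95   # near-certain the post violates policy
AUTO_APPROVE_THRESHOLD = 0.05  # near-certain the post is benign

@dataclass
class Post:
    post_id: str
    text: str

def violation_score(post: Post) -> float:
    """Stand-in for an ML classifier; returns P(post violates policy).

    A real system would call a trained model here. This stub flags a
    keyword so the example runs end to end.
    """
    return 0.99 if "hate" in post.text.lower() else 0.50

def triage(post: Post) -> str:
    score = violation_score(post)
    if score >= AUTO_REMOVE_THRESHOLD:
        return "auto_remove"        # high-confidence violation: no human needed
    if score <= AUTO_APPROVE_THRESHOLD:
        return "auto_approve"       # high-confidence benign: no human needed
    return "human_review_queue"     # ambiguous: a human moderator decides

if __name__ == "__main__":
    posts = [Post("1", "post spreading hate"), Post("2", "holiday photos")]
    for p in posts:
        print(p.post_id, triage(p))
```

The design choice embodied here is the trade-off the article describes: raising the automation thresholds reduces the volume of content humans must see, but every borderline case still lands in the human review queue, which is exactly the work the Kenyan moderators were performing.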
Potential Impacts on the Content Moderation Industry
The Kenyan court’s ruling could have broader implications for the content moderation industry, as it may encourage other content moderators to seek legal recourse for their grievances. This could ultimately lead to greater scrutiny of social media companies and their content moderation practices, potentially prompting industry-wide changes.
Conclusion
The Kenyan court’s ruling that Meta is the primary employer of content moderators in this case has far-reaching implications for the company and its content moderation operations in Africa. Ultimately, this case highlights the need for greater transparency and accountability in the content moderation industry. It also emphasizes the importance of safeguarding the rights and well-being of content moderators who play a crucial role in maintaining the safety and integrity of social media platforms.