
Kenyan Court Halts Mass Layoff of Facebook Moderators, Orders Support and Care

Kenyan Court Protects Moderators’ Rights and Mental Health
In a landmark ruling, a Kenyan court has ordered the suspension of the mass layoff of content moderators employed by Sama, a subcontractor for Meta, the parent company of Facebook. The court also directed Meta to provide counseling and support to the affected moderators. The 184 moderators had filed a lawsuit in March challenging the legality of their dismissal.
The labour court judge, Byram Ongaya, issued an interim order restraining Meta and Sama from terminating the moderators’ contracts until the lawsuit is resolved. The judge further ordered that any contracts set to lapse during this period be extended. Additionally, Facebook’s new outsourcing firm, Majorel, was barred from blacklisting the moderators from future roles.
The court’s ruling also emphasized Meta’s responsibility to provide proper medical, psychiatric, and psychological care for the affected moderators as well as other Facebook content moderators. The decision acknowledges the toll that content moderation work can take on employees’ mental well-being and highlights the need for proper support in the field.
Landmark Ruling Challenges Facebook’s Outsourcing Model
The Kenyan court’s decision has far-reaching implications for the social media and AI industry, as it treats Facebook as the true employer of its moderators. The ruling deals a major blow to the outsourcing model Facebook has used to distance itself from responsibility for the well-being of its key safety workers. Foxglove, the legal activist firm supporting the case, welcomed the decision as a significant step in holding tech giants accountable for the treatment of their workers.
This ruling comes amid increased scrutiny of Meta and other tech companies over the working conditions and mental health support provided to content moderators. Content moderation involves prolonged exposure to hateful and disturbing posts, which can take a toll on moderators’ mental health. Meta is also facing two other legal cases in Kenya, including one over poor working conditions and a lack of mental health support.
As the world continues to grapple with the challenges of online safety and responsible content moderation, this ruling underscores the importance of safeguarding the well-being and rights of those tasked with this critical work. It serves as a reminder that tech giants must take their corporate responsibility seriously and prioritize the support and protection of their workers.