When Olumide Adeniran started working at Webhelp’s Cape Town office in May 2022, he joined what seemed like a burgeoning tech opportunity. As a content moderator for ByteDance’s TikTok platform, focusing on Nigerian-language content, he was part of Africa’s growing role in the global digital economy. Barely six months later, he was unemployed and fighting a legal battle over his dismissal, one that would end in a court judgment exposing uncomfortable truths about the continent’s position in the tech ecosystem.
The Labour Court of South Africa’s January 2026 ruling in Adeniran’s case offers a sobering glimpse into the vulnerability of African workers in the global content moderation industry. What began as a dispute over a single retrenchment ultimately illuminated a much larger story about precarious employment, unilateral contract cancellations, and the fragility of Africa’s tech talent dream.
The Anatomy of a Vanishing Job
The facts are straightforward but stark. In September 2022, ByteDance informed Webhelp it would discontinue the Nigerian-language component of its content moderation campaign. The reason: low content volumes and an inability to recruit the full team of 20 moderators required. At the time, only nine moderators and one quality assurance employee were working on the campaign.
By November 2022, all were out of work.
Adeniran, who held a Bachelor of Commerce degree in Management Marketing, believed his dismissal was unfair. He sought reinstatement in a finance or management position with “excellent pay” and restoration of benefits including medical aid and funeral insurance. He initially claimed R10 million in damages, citing depression, trauma, and alleged defamation.
The court disagreed. Judge Lagrange ruled the retrenchment was both substantively and procedurally fair. The reasoning was clinical: when ByteDance terminated the contract, there was no longer any work for Nigerian-language moderators. Adeniran had been offered alternative positions, including roles in customer service campaigns, but declined them because they paid substantially less.
Most tellingly, Adeniran acknowledged in cross-examination that the termination of the ByteDance campaign meant “there was no work for the Nigerian language moderators and that retrenchment had to be considered.”
A Continent’s Pattern of Precarity
Adeniran’s case is far from unique. It exemplifies a growing pattern across Africa’s content moderation sector, where workers from Nairobi to Lagos to Cape Town perform essential but undervalued work for global tech giants, only to find themselves disposable when business priorities shift.
In Kenya, which has emerged as Africa’s primary content moderation hub, workers have faced similar instability. When Sama, an outsourcing company that supplied content moderators to Facebook and data labellers to OpenAI, ended its Facebook contract in March 2023, 260 workers lost their jobs. Former moderator Daniel Motaung has become a prominent figure in exposing working conditions, filing lawsuits alleging psychological trauma from exposure to disturbing content.
James Oyange Odhiambo, a former TikTok content moderator in Kenya employed through Majorel (another outsourcing firm), described reviewing hundreds of pieces of graphic and disturbing content daily for approximately $1.50 per hour. “If you put so much dirty content in your mind, it changes you,” he told researchers. His contract was not renewed in April 2023 after he advocated for better working conditions.
In May 2023, more than 150 content moderators across platforms including Facebook, TikTok, and ChatGPT voted to form the African Content Moderators Union, demanding better pay, mental health support, and labour protections.
The Economics of Outsourcing
The business model is clear: global platforms outsource content moderation to jurisdictions with lower labour costs and fewer regulatory protections. While content moderators in Europe and the United States earn upwards of $20 per hour, their African counterparts typically receive between $1.50 and $2.20 hourly for identical work.
For companies like ByteDance, Meta, and OpenAI, Africa offers a skilled, multilingual workforce at a fraction of Western costs. South Africa provides an educated English-speaking population with what HR managers describe as a “neutral” accent desirable for customer service roles. Kenya’s tech ecosystem attracts workers from across the continent to moderate content in diverse African languages.
But this arrangement creates profound asymmetries of power. When ByteDance decided Nigerian-language content volumes were too low to justify a full team, it simply cancelled the contract. Workers who had relocated, built lives, and depended on stable employment found themselves without recourse.
The Adeniran judgment reveals how outsourcing structures shield platforms from responsibility. Webhelp, not ByteDance, was the named defendant. The workers’ employer was a South African company; their work benefited a Chinese tech giant. When the contract ended, neither entity bore any obligation beyond statutory severance pay — in Adeniran’s case, two weeks’ wages.
The Illusion of Alternative Employment
A particularly striking aspect of the court case was the question of alternative positions. Webhelp argued it had made reasonable efforts to redeploy affected workers, offering roles in a General Electric healthcare campaign requiring Nigerian language skills and customer service positions for English speakers.
Adeniran declined the GE position, stating he had no interest in it. He also rejected the customer service roles, which would have paid approximately R7,000 monthly compared to his previous R9,000 salary — a reduction he deemed unacceptable.
The court sided with Webhelp. Judge Lagrange noted that even if the English proficiency assessment for customer service roles had been conducted unfairly, this didn’t matter because Adeniran “still would not have accepted appointment in the campaign because the salary was unacceptably low.”
This reasoning exposes a cruel bind. Workers displaced by contract cancellations face a choice: accept significantly lower wages in unfamiliar roles, or pursue legal remedies that courts are unlikely to support. The judgment essentially validates a system where workers must either accept whatever alternative employment is offered, regardless of pay cuts, or be found to have voluntarily removed themselves from consideration.
The Automation Threat
The precarity facing African content moderators is intensifying as platforms increasingly turn to artificial intelligence. In October 2024, TikTok laid off hundreds of content moderation employees, primarily in Malaysia, as part of a shift toward automated moderation. The company stated that 80% of violative content is now removed by automated technologies.
This trend presents a paradox. African workers are employed to train AI systems, labelling and categorizing content to improve algorithmic moderation. Yet the same systems they help develop will ultimately replace them. It’s a pattern reminiscent of historical extractive relationships, where African labour and resources support technological advancement elsewhere while providing limited lasting benefit locally.
For platforms, AI moderation offers obvious advantages: lower costs, unlimited scalability, no mental health liabilities, no unions to negotiate with. For African workers, it represents an existential threat to an already precarious employment sector.
Policy Vacuum
One striking finding from research is the policy vacuum surrounding digital labour in most African countries. While Kenya has launched some initiatives to support online freelancers and Nigeria has invested in tech hubs, comprehensive frameworks protecting platform workers remain largely absent.
South Africa, despite contributing to the global freelance economy and hosting significant outsourcing operations, has yet to articulate a clear policy vision for gig workers. There are no government grants tailored to digital freelancers, no tax incentives, no national skills programmes specifically designed for platform work.
This absence of policy recognition means digital workers exist in a regulatory grey zone. They’re not treated as traditional employees entitled to benefits and protections, yet they lack the autonomy genuinely independent contractors might possess. Platforms dictate rates, working conditions, and performance metrics while disclaiming employment relationships.
The International Labour Organization has begun developing standards for platform work, and some jurisdictions are taking action. Spain’s 2021 “Rider Law” reclassified food delivery couriers as employees, and the EU Platform Work Directive, adopted in 2024, introduces a rebuttable presumption of employment for platform workers.
But such protections remain distant for most African workers. Countries compete to attract outsourcing investments by offering low-cost labour and minimal regulation. This race to the bottom leaves workers vulnerable to precisely the kind of sudden contract cancellations that eliminated Adeniran’s job.
The Bottom Line
The contradictions are becoming harder to ignore. As AI systems trained by African workers increasingly replace those same workers, as courts validate dismissals that leave skilled workers unemployed, as the promise of tech jobs proves hollow for many who pursue them, questions multiply.
Can Africa build a digital economy that serves African interests, or will it remain a source of cheap, contingent labour for platforms headquartered elsewhere? Can policy frameworks be developed that provide workers meaningful protections without driving away investment? Can organizing efforts build sufficient power to shift the asymmetric relationships between workers and platforms?
The Adeniran judgment offers no answers to these questions. It simply applied existing law to existing employment relationships and found them adequate. But adequacy is measured by legal standards, not by whether outcomes serve human dignity or economic justice.
Perhaps the real lesson is that relying on labour law alone is insufficient. The structures that left Adeniran and his colleagues vulnerable — outsourcing arrangements, fixed-term contracts, unilateral cancellation clauses, jurisdiction shopping — are features, not bugs, of the global platform economy.
Addressing them requires more than better severance packages or fairer selection criteria. It requires confronting fundamental questions about power, ownership, and value in the digital economy. It requires recognizing that when platforms can terminate hundreds of jobs with a single contract cancellation, something more than procedural fairness is at stake.
For now, Adeniran’s case stands as a documentation of what happens when Africa’s tech talent dream meets global capitalism’s priorities. The court found his dismissal fair. But fairness, as the judgment makes clear, is a question of legal compliance, not a measure of justice or sustainability.
The hundreds of thousands of Africans working in content moderation, data annotation, and other forms of digital labour might draw their own conclusions about what the case reveals. And increasingly, they’re organizing to change the story’s ending.
The Labour Court ruling in Adeniran v Webhelp SA Outsourcing (Pty) Ltd was issued on January 19, 2026, in Cape Town.

