Kenyan moderators behind ChatGPT want parliament to probe OpenAI, Sama over exploitation
The Kenyan content moderators who helped train the Artificial Intelligence (AI) chatbot ChatGPT have filed a petition in parliament seeking a probe into the bot’s parent company OpenAI and its local moderation partner Samasource.
The Sama employees allege exploitation and underpayment during the creation of the popular chatbot and want the Kenyan government to investigate the companies and regulate the work of tech companies operating in the country.
In court documents seen by Citizen Digital, Richard Mwaura Mathenge, Mophat Ochieng Okinyi, Alex Mwaura Kairu and Bill Kelvin Mulinya on behalf of the moderation team say in 2021, the US AI company partnered with Sama.
Sama was contracted to provide the workforce that would train and clean up data for the now hugely popular chatbot, launched in November last year, so that it could communicate and interact with people as if it were a person itself.
“Sama engaged us and other young Kenyans on temporary contracts to do this work. The contracts did not describe sufficiently the nature of the job,” the moderators say.
They claim they were not properly informed of the nature of the work they would be undertaking, which involved reading and viewing material that depicted sexual and graphic violence and categorizing it accordingly so that ChatGPT's AI could learn from it for its future interactions with people.
“Examples of the content that we were exposed to include; acts of bestiality, necrophilia, incestuous sexual violence, rape, defilement of minors, self-harm (e.g. suicide), and murder just to mention a few,” they say.
- 'No psychological support' -
All through the ChatGPT training process, the workers say they were not afforded psychosocial support and that due to the exposure to the work, they have developed severe mental illnesses including PTSD, paranoia, depression, anxiety, insomnia and sexual dysfunction.
Additionally, the moderators say the contract between OpenAI and Sama was terminated abruptly, and that they were sent home despite already suffering from severe mental illness.
“The outsourcing model is commonly used by big technological companies based in the United States to export harmful and dangerous work to Kenyan youth. The outsourced workers are paid poorly and are not provided with the care they need to undertake such jobs. They are disposed of at will,” read the court documents.
The moderators also express concern that the outsourcing model leaves workers treated poorly and not afforded the same protections as full-time employees.
They now want the National Assembly Public Petitions Committee to probe the nature and conditions of work and the operations of Sama, as well as of other companies operating in Kenya to whom big tech companies such as Meta, TikTok, Google, Microsoft and OpenAI outsource their content moderation and other AI work.
The team further wants parliament to interrogate the role of the Ministry of Labour in protecting, or failing to protect, Kenyan youth working for Sama and other companies on behalf of technology companies based outside Kenya.
In their view, the licences of companies that contribute to the exploitation of Kenyan youth should be withdrawn.
The moderators want parliament to enact laws regulating the outsourcing of harmful and dangerous technology work, protecting workers engaged through such arrangements, and to amend the Employment Act to offer protection to workers engaged through outsourcing agreements.
“Amend the Occupational Health Act to include exposure to harmful content as an occupational hazard,” they say in their prayers to the National Assembly.
Their petition comes two months after more than 150 moderators working for Facebook, TikTok, and ChatGPT resolved to register a workers union, dubbed the Content Moderators Union, open to moderators from any major tech firm.
This was in the wake of a legal battle by Sama moderators who did content moderation for Facebook parent company Meta and who allege poor working conditions at the Nairobi moderation hub.
In 2019, a Sama moderator named Daniel Motaung started a group that tried to negotiate over unfair conditions, including pay and mental health care.
The workers say Facebook and Sama ignored their demands, instead crushing the nascent union and forcing Motaung to leave Kenya.
But after a 2022 Time Magazine exposé that lifted the lid on the exploitation of African Facebook moderators in Nairobi, a wave of legal action and organising took off and has culminated in two judgments by Kenyan courts against Meta.