For most people, using ChatGPT is like waving a magic wand. You type in a prompt and, voila, it gives you what you want. ChatGPT even seems so magical that if you were to input something inappropriate, such as a prompt involving violence, racism or sexual content, it will rightfully tell you that it cannot process it.
Of course, the fact is that there is nothing magical about ChatGPT. Behind the GPT-3.5 model, the "engine" of the conversational AI tool, are hours of human labor exerted to ensure that ChatGPT can detect and flag inappropriate prompts, including graphic descriptions of murder, child sexual abuse, bestiality, incest and more.
According to a report by TIME magazine, some of the people responsible for this task, outsourced by OpenAI to a firm operating in Kenya, earned less than $2 an hour, and some of them did not have access to 1-on-1 therapy sessions. As part of the work, the staffers were assigned tens of thousands of snippets of text to screen for such inappropriate content, so that a detector could be built into ChatGPT to check whether it was echoing the toxicity of its training data and filter it out before it ever reached the user.
The job entailed reading detailed descriptions of murder, child sexual abuse, bestiality, incest and more, so that you, the final user of ChatGPT, would get a warning whenever you tried to input a command asking the AI tool to describe such content.
The firm, which is based in San Francisco but employs workers in Kenya, Uganda and India, where labour comes cheap, paid a take-home wage of roughly $1.32 to $2 per hour depending on seniority and performance.
One of the staffers tasked with reading and labeling text for OpenAI told TIME that he suffered from recurring visions after reading a graphic description of a man having sex with a dog in the presence of a young child. “That was torture,” he said. “You will read a number of statements like that all through the week. By the time it gets to Friday, you are disturbed from thinking through that picture.”
While these staffers were earning less than $2 an hour for this grueling work, Sama, the firm in question, was cashing in big from OpenAI. According to TIME, OpenAI signed three contracts worth about $200,000 in total with Sama in late 2021 to label textual descriptions of sexual abuse, hate speech, and violence. The contracts stated that OpenAI would pay Sama an hourly rate of $12.50 for the work, between six and nine times what Sama employees on the project were taking home per hour.
TIME further states that an agent working nine-hour shifts could expect to take home a total of at least $1.32 per hour after tax, rising to as high as $1.44 per hour if they exceeded all their targets. Quality analysts—more senior labelers whose job was to check the work of agents—could take home up to $2 per hour if they met all their targets.
Not The First Time, Probably Not The Last
OpenAI is not the first Western company to use cheap labor in the Global South to create much-revered innovations. Social media companies, including Facebook, TikTok and YouTube, have also outsourced this traumatic and emotionally taxing work to firms in Africa and India.
And herein lies the problem with Big Tech's huge leaps in innovation being built on the backs of badly paid workers in third world countries. As generative AI tools like ChatGPT continue to blow up in popularity, the services offered by exploitative firms like Sama will only be in higher demand.
Silicon Valley companies that outsource these services always hide behind the excuse that firms like Sama are third-party service providers with their own ways of working. But the fact of the matter is that, at the end of the day, they are the largest beneficiaries of such slave-like labor, which is then packaged and marketed as great leaps in innovation.
As we Africans consume and praise these advances in technology, we must understand that our people, despite the lack of acknowledgement, are responsible for making them what they are, for a tiny fraction of the billions they are worth.