Machine learning systems power the wildly popular AI chatbot ChatGPT, but those systems are guided by human workers, many of whom aren't paid particularly well. A new report from NBC News shows that OpenAI, the startup behind ChatGPT, has been paying droves of U.S. contractors to assist with the necessary task of data labeling: annotating training data so that ChatGPT's underlying models respond more accurately to user requests. The compensation for this pivotal task? $15 per hour.
“We are grunt workers, but there would be no AI language systems without it,” one worker, Alexej Savreux, told NBC. “You can design all the neural networks you want, and you can get all the researchers involved you want, but without labelers, you have no ChatGPT. You have nothing.”
Data labeling involves tagging data samples so that automated systems can better identify particular items within a dataset. Labelers tag specific items, whether distinct visual images or sections of text, so that machines can learn to recognize similar items on their own. In doing so, human workers help automated systems respond more accurately to user requests, playing a significant role in training machine learning models.
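As a minimal sketch of what this work produces, consider a handful of text samples paired with labels. The samples, label names, and helper function here are hypothetical illustrations, not anything from OpenAI's actual pipeline:

```python
# Hypothetical human-labeled training data. A labeler reads each text
# sample and attaches a category tag; a model trained on enough of
# these pairs learns to assign labels to new, unseen text on its own.
labeled_samples = [
    {"text": "How do I reset my password?", "label": "support_request"},
    {"text": "You are a worthless idiot.", "label": "abusive"},
    {"text": "What's the weather like in Kansas City?", "label": "benign_question"},
]

def label_distribution(samples):
    """Count how many examples carry each label."""
    counts = {}
    for sample in samples:
        counts[sample["label"]] = counts.get(sample["label"], 0) + 1
    return counts

print(label_distribution(labeled_samples))
```

The distribution check is the kind of sanity pass labeling teams run to spot class imbalance before a dataset is handed off for training.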
But despite the importance of this position, NBC notes that most of these workers are not compensated particularly well. In OpenAI's case, the data labelers receive no benefits and are paid little more than the minimum wage in some states. Savreux is based in Kansas City, where the minimum wage is $7.25 per hour.
As terrible as that is, it's still an upgrade from how OpenAI used to staff its moderation teams. Previously, the company outsourced the work to moderators in Africa, where, due to depressed wages and limited labor laws, it could get away with paying workers as little as $2 per hour. It collaborated with Sama, an American firm that says it's devoted to an "ethical AI supply chain" but whose main claim to fame is connecting big tech companies with low-wage contractors in developing countries.
Sama was previously sued over allegations of poor working conditions. Its low-paid moderators in Kenya ultimately helped OpenAI build a filtration system to weed out obscene or offensive material submitted to its chatbot. To do so, however, they had to wade through screenfuls of that abusive material themselves, including descriptions of murder, torture, sexual violence, and incest.
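To make the connection between labeling work and the filtration system concrete, here is a deliberately simplified sketch. Real moderation filters are trained classifiers; this toy version, with made-up sample phrases and a hypothetical word-overlap heuristic, only illustrates how human-tagged abusive examples become the raw material for flagging similar new inputs:

```python
# Hypothetical examples a moderator has tagged as abusive.
# (Illustrative placeholders, not real training data.)
ABUSIVE_EXAMPLES = [
    "graphic description of violence",
    "threatening message",
]

def build_blocklist(labeled_examples):
    """Collect the distinct words seen in abusive training samples."""
    words = set()
    for text in labeled_examples:
        words.update(text.lower().split())
    return words

def looks_abusive(text, blocklist, threshold=2):
    """Flag text sharing at least `threshold` words with abusive samples."""
    overlap = set(text.lower().split()) & blocklist
    return len(overlap) >= threshold

blocklist = build_blocklist(ABUSIVE_EXAMPLES)
print(looks_abusive("a threatening and graphic message", blocklist))  # True
print(looks_abusive("what is the weather today", blocklist))          # False
```

The point of the sketch is the dependency it exposes: the filter is only as good as the labeled examples behind it, which is exactly the work the moderators were paid so little to do.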
Artificial intelligence may seem to spring to life and respond to user requests as if by magic, but in reality it's helped along by droves of invisible human workers who deserve better for their contributions.
Call for Fair Compensation for AI Workers
The Partnership on AI warned in a 2021 report that a spike in demand was coming for "data enrichment work." It recommended that the industry commit to fair compensation and other improved practices, and last year it published voluntary guidelines for companies to follow. DeepMind, Google's AI subsidiary, is so far the only tech company to have publicly committed to those guidelines.
“A lot of people have recognized that this is important to do,” said Sonam Jindal, the program lead for AI, labor, and the economy at the Partnership on AI. “The challenge now is to get companies to do it. This is a new job that AI is creating. We have the potential for this to be a high-quality job and for workers to be respected and valued for their contributions to enabling this advancement.”
- The human workers responsible for labeling data for AI systems are often underpaid and underappreciated.
- The problem is growing: demand for data labeling rises as AI systems become more sophisticated.
- Tech companies need to do more to ensure that the workers who are essential to the development of AI are fairly compensated and treated with respect.
- Some tech companies are starting to take steps in this direction, but more needs to be done.
The above article was written, edited, and reviewed with AI assistance by experienced CEO.com journalists and researchers to produce the most accurate and highest-quality information.