"Workers absolutely should have the right to know what they are working on, and especially when moral or politically controversial activities are involved," said Juliet Schor, a professor of sociology at Boston College.
Millions of workers in the so-called gig economy unknowingly helped Google assist the Pentagon in creating AI technology to be used in a controversial drone-targeting project.
The project, known as Project Maven, was a Department of Defense initiative launched last year. As part of the program, the DoD quietly retained Google to help train AI to distinguish between certain objects on the ground.
Gig workers are people who perform seemingly mundane tasks on crowdsourcing platforms, earning as little as $1 per hour for "micro-tasks". These micro-tasks are part of a model known as "human-in-the-loop," in which humans intervene where computers typically fail. Figure Eight, the company Google hired workers from, positions itself as a machine learning company that "transforms unstructured text, image, audio, and video data into customized high-quality training data" for AI.
"Human-in-the-loop (HITL) is a branch of artificial intelligence that leverages both human and machine intelligence to create machine learning models," explains an excerpt from Figure Eight's marketing material. "In a traditional human-in-the-loop approach, people are involved in a virtuous circle where they train, tune, and test a particular algorithm."
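The "virtuous circle" Figure Eight describes is commonly implemented by routing items a model is unsure about to human annotators, whose labels then feed back into training. A minimal sketch of that routing step, with an assumed confidence threshold and stand-in functions in place of any real model or crowd platform:

```python
import random

CONFIDENCE_THRESHOLD = 0.8  # assumed cutoff; below this, a human labels the item


def model_predict(item: str) -> tuple:
    """Stand-in for a trained classifier: returns (label, confidence)."""
    random.seed(hash(item) % (2 ** 32))
    return ("vehicle", random.uniform(0.5, 1.0))


def human_label(item: str) -> str:
    """Stand-in for a crowd worker supplying the ground-truth label."""
    return "vehicle"


def human_in_the_loop(items: list) -> dict:
    """Keep confident model predictions; send the rest to humans."""
    results = {}
    for item in items:
        label, confidence = model_predict(item)
        if confidence < CONFIDENCE_THRESHOLD:
            # Human intervenes where the machine is unsure
            label = human_label(item)
        results[item] = label
    return results


labels = human_in_the_loop(["tile_a", "tile_b", "tile_c"])
```

In a full pipeline, the human-supplied labels would also be collected as new training data, closing the train-tune-test loop the marketing material describes.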
The specific tasks Google hired the workers to perform involved labeling objects in satellite images so the AI could tell the difference between trees, buildings, vehicles, and other objects. The data obtained with the help of gig workers was then sent on to the Pentagon and incorporated into Project Maven, helping military drones perform real-time analysis of satellite imagery.
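The labeling work described above amounts to workers drawing boxes around objects so a model can learn to recognize them. A rough sketch of how such annotations might be folded into a training record; the record layout, class names, and coordinates here are illustrative assumptions, not Figure Eight's actual format:

```python
from dataclasses import dataclass
from typing import List


@dataclass
class BoundingBox:
    # Pixel coordinates of a box a worker drew around one object
    x_min: int
    y_min: int
    x_max: int
    y_max: int
    label: str  # e.g. "tree", "building", "vehicle"


def to_training_example(image_id: str, boxes: List[BoundingBox]) -> dict:
    """Fold one image's worker annotations into a single training record."""
    return {
        "image_id": image_id,
        "boxes": [[b.x_min, b.y_min, b.x_max, b.y_max] for b in boxes],
        "labels": [b.label for b in boxes],
    }


# One worker's annotations for a single (hypothetical) satellite tile
annotations = [
    BoundingBox(10, 20, 60, 80, "building"),
    BoundingBox(100, 40, 140, 90, "vehicle"),
]
example = to_training_example("tile_0001", annotations)
```

Notably, nothing in a record like this reveals who commissioned the labels or what the resulting model will be used for, which is exactly the information the workers lacked.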
In April of last year, Google CEO Sundar Pichai was presented with a petition signed by more than 3,100 Google employees demanding that the company stop assisting the Pentagon with the drone project. In June, citing "internal pressure," Google announced that it would not renew the government contract, which is set to end at some point in 2019.
Questions of ethics lie at the core of Google's decision to use gig workers to gather the data. The Figure Eight workers had no idea what they were actually working on, whom it benefited, or what they were helping to build; they did not even know that Google or the Pentagon was behind the tasks. And because these gig workers are contractors, they do not qualify for benefits or a minimum wage.
A former Figure Eight contributor stated anonymously that, "Contributors to the Figure Eight platform are not given who the data will benefit. Usually, they are given a reason for why they are doing a task, like, 'Draw boxes around a certain product to help machines recognize it,' but they are not given the company that receives the data." Given the controversy surrounding the use of military drones to strike enemy targets, the use of low-paid, uninformed workers seems egregious and immoral.
Neither Google nor Figure Eight has commented publicly.