
Low-paid workers unknowingly helped Google build controversial AI for military drones

By DPennington · 12 replies
Feb 5, 2019
  1. Millions of workers in the so-called gig economy unknowingly helped Google assist the Pentagon in creating AI technology to be used in a controversial drone-targeting project.

    The project, known as Project Maven, was part of a Department of Defense initiative last year. As part of the program, the DoD quietly retained Google's assistance in helping the AI distinguish between certain objects on the ground.

    Gig workers are people who perform seemingly mundane tasks on a crowdsourcing platform, earning as little as $1 per hour for "micro-tasks". These micro-tasks are part of a model known as "human-in-the-loop," a concept that calls for human intervention where computers usually fail. The company Google hired workers from, Figure Eight, positions itself as a machine learning company that "transforms unstructured text, image, audio, and video data into customized high-quality training data" for AI.

    “Human-in-the-loop (HITL) is a branch of artificial intelligence that leverages both human and machine intelligence to create machine learning models,” explains an excerpt from Figure Eight's marketing material. “In a traditional human-in-the-loop approach, people are involved in a virtuous circle where they train, tune, and test a particular algorithm.”

    The specific tasks Google hired the workers to perform involved labeling objects in satellite images to help the AI tell the difference between things like trees, buildings, vehicles, and other objects. The data obtained with the help of gig workers was then sent on to the Pentagon and incorporated into Project Maven, helping military drones engage in real-time analysis of satellite imagery.

    In April of last year, Google CEO Sundar Pichai was presented with a petition signed by over 3,100 Google employees demanding that the company stop assisting the Pentagon with the drone project. In June, Google announced that it would not renew its government contract due to "internal pressure," meaning the contract is set to end at some point in 2019.

    A question of ethics lies at the core of Google's decision to use gig workers to gather data. The Figure Eight workers had no idea what they were actually working on, who it was benefiting, or what they were helping to build. Nor did they know that Google or the Pentagon was behind the work they were doing. Because these gig workers are contractors, they do not qualify for benefits or minimum wage.

    A former Figure Eight contributor stated anonymously that, “Contributors to the Figure Eight platform are not given who the data will benefit. Usually, they are given a reason for why they are doing a task, like, ‘Draw boxes around a certain product to help machines recognize it,’ but they are not given the company that receives the data.” Given the controversy surrounding the use of militant drones to execute enemy targets, the use of low-paid, uninformed workers seems egregious and immoral.

    Neither Google nor Figure Eight has commented publicly.


  2. p51d007

    p51d007 TS Evangelist Posts: 1,889   +1,162

    LOL, the little snowflakes' heads will probably retroactively explode.
    btfsttg and psycros like this.
  3. brucek

    brucek TS Maniac Posts: 132   +166

    Not buying the moral outrage here.

    First, any contribution to anything meaningful in tech is ultimately going to be used thousands or millions of ways, many of them unanticipated and unknowable. Even if the Pentagon gets first crack, I do not see how helping visual recognition identify, say, a tree is not going to be widely needed, widely used, and done by someone.

    Second, any high-tech weapons development has always been need-to-know, including limited information for sub-contractors. Nothing new here.

    And as to the Google folks refusing to help the defense of their company's host country, if it ends up being a human life that is later sacrificed to defend their family, I hope they'll at least send a thank you card. Or even if it's an enemy civilian killed needlessly by a dumber weapon that could've been made smarter with their help.
  4. Uncle Al

    Uncle Al TS Evangelist Posts: 5,132   +3,553

    Due to security requirements, compartmentalization is VERY common and expected on any DOD projects that require security clearances. Anyone who works in such conditions is briefed on that fact, and on the fact that they will not know what they are working on (big picture). The only moral outrage here is from those who simply want to scream and cry .... the very first to scream "protect us, protect us" in times where they may suffer the consequences ......
    Clamyboy74, btfsttg and psycros like this.
  5. m4a4

    m4a4 TS Evangelist Posts: 1,392   +963

    Yup. Too many people today act with feelings only, leaving facts and reason off to the side...
    Clamyboy74, btfsttg and psycros like this.
  6. psycros

    psycros TS Evangelist Posts: 2,618   +2,349

    "Given the controversy surrounding the use of militant drones to execute enemy targets, the use of low-paid, uninformed workers seems egregious and immoral."

    And the intellectual descent of Techspot continues. LOL, what exactly is a "militant drone"? Is the author so ignorant that he thinks drones are out there rampaging unattended? They can't even fire w/o human approval or pre-programming. Heaven forbid the military would try to improve target distinction for its automated weapons! I suppose that Pennington would rather have a drone AI asking its human controller, "Can I go ahead and wreck that APC?" when it's actually an ice cream truck.
    btfsttg likes this.
  7. cliffordcooley

    cliffordcooley TS Guardian Fighter Posts: 11,208   +4,877

    My complaint is: what the hell are we paying them for if they have to outsource the code? Seems we are paying for incompetence plus outsourcing. Might as well fire the lot and put Google in charge. You know damn well they are putting back doors in the code.
  8. Puiu

    Puiu TS Evangelist Posts: 3,294   +1,747

    The outrage is simple: very few people want to work for a company that specifically created software for military use that will directly lead to the death of others. Whether or not this technology has civilian use is irrelevant at the current point in time.
    You also know just how secretive US military operations are. You don't know if the code you wrote is used in morally right, grey, or bad situations. "For the country" is just an excuse in this day and age.
    Last edited: Feb 6, 2019
    xxLCxx likes this.
  9. DPennington

    DPennington TS Addict Topic Starter Posts: 88   +32

    Pointing out the controversy that exists regarding the use of drones does not mean I'm personally against the use of them. You're suggesting, then, that the only way to improve the accuracy of drones is to utilize $1 hour gig workers who are unaware that they are being used to enhance a military technology they may morally disagree with? This was not commentary on my opinion of drones.

    For the record, I have no qualms about the military a) using drones and b) making them as accurate and technically advanced as possible. If we can keep soldiers safe by utilizing machines, then by all means do so. The point I was making was regarding the method Google was using to fulfill their obligation to this project.

    So to answer your questions, a militant drone is a drone being utilized by the military. The use of the adjective militant was not tied to any negative connotation. I do not think drones are "rampaging unattended", nor do I take issue with humans controlling them. And no, I would not like to see drones confusing APCs with ice cream trucks.

    Intellectually-declining Pennington, out.
  10. Cubi Dorf

    Cubi Dorf TS Booster Posts: 127   +49

    Technology may be making world a scary place. However this is future. Even if America doesn’t do, others will eventually
  11. Cubi Dorf

    Cubi Dorf TS Booster Posts: 127   +49

    Google is shady business. I am not knowing about contract law in America to know if they illegaling worker however.
  12. captaincranky

    captaincranky TechSpot Addict Posts: 14,685   +3,836

    First, pay no attention to those assuming the role(s) of SJW in the comments.

    But secondly, the term "militant" is, in the vernacular, normally attached to an activist, combatant, or organization opposed to a particular issue, government, religion, etc., that advances its cause violently, or conducts itself in a manner consistent with the threat of violence. Yes, advocacy of violence is enough to classify an individual, group, or organization as "militant".

    Thus, a "militant drone", would have to "wake itself up in the morning", and autonomously decide to blow up a mosque, ice cream truck, or the poor mailman trudging through 12" of snow, before it could colloquially be considered, "militant".

    "Military" drone(s) accordingly, would more appropriately have attached in all instances of the base adjective's usage.

    CODA: If you lived where I do, you actually wouldn't mind having drones summarily attacking ice cream trucks during the summer. The endless repetitiveness of those damned jingles they play to let you know "we're here" is the equivalent of musical water boarding.
    Last edited: Feb 6, 2019
    DPennington likes this.
  13. DPennington

    DPennington TS Addict Topic Starter Posts: 88   +32

    Fair enough. Bad word choice on my part. Thank you for this clarification.
