If you're a fan of science fiction, you'll instantly recognize this line: "I cannot harm, or by omission of action allow to be harmed, a human being." A modified version of one of the "Three Laws of Robotics," under which conscious machines could never be used against humans, has now found its way into a GPL license. A new piece of software called "GPU," which seeks to create a P2P network of machines sharing their resources, is under development, putting supercomputing power into the hands of regular Joes. In creating it, the developers decided they quite literally didn't want the military using their software, and crafted an altered GPL license explicitly for that scenario:

Tiziano Mengotti and Rene Tegel are the lead developers on the GPU project. Mengotti is the driving force behind the license "patch," which says "the program and its derivative work will neither be modified nor executed to harm any human being nor through inaction permit any human being to be harmed."
They have very specific reasons for wanting this, and they explain them in the article:

Mengotti says the clause is specifically intended to prevent military use. "We are software developers who dedicate part of our free time to open source development. The fact is that open source is used by the military industry. Open source operating systems can steer warplanes and rockets. [This] patch should make clear to users of the software that this is definitely not allowed by the licenser."
A noble cause, maybe, but just how do the developers intend to enforce this portion of the license? Odds are that if a piece of military hardware is running open source software, you'll never know. It's a short read, but an interesting one. Take a look.