- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey orders given to it by human beings, except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
It’s also common knowledge that Asimov’s stories often served as "proving grounds" for the laws (there’s a longish and reasonably geeky discussion of the three laws at Wikipedia that examines how the stories tested them). Over the years, he experimented with various augmentations of and alterations to the structure of the laws.
It’s a safe bet, though, that he never envisioned any laws like the ones put forward by the Something Awful crew today.
9. A robot must stop visiting Isaac Asimov’s bedroom at night and fabricating situations that would make it appear that the sleeping Asimov has less than total control of his urinary faculties.
10. A robot, when given contradictory orders by two human beings, and assuming those orders do not violate the First Law, must decide which order to follow based on which human being has a deeper voice.
23. A robot must shut up around girls and let me, Isaac Asimov, do the talking; however, a robot may bail me out if things start to go haywire.
30. A robot may not change the channel, or by omission of action allow the channel to be changed, during a Niners game.
In all, the SA goons have come up with not three, but thirty laws of robotics. Asimov’s spinning, but frankly, my sides are splitting. Anyone up for a competition? Let’s hear it: What other laws of robotics did Asimov "forget"?