I do not fear computers. I fear the lack of them.
In the March 1942 issue of Astounding Science Fiction magazine, the American writer Isaac Asimov introduced the Three Laws of Robotics in his short story “Runaround”. The Three Laws are:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
The story was written in October 1941 and featured Asimov’s recurring characters Powell and Donovan, as well as the robot SPD-13, also known as “Speedy”. As in many of his Robot stories, Asimov used conflicts in the application of the Three Laws of Robotics as the subject of the plot. The robot finds it impossible to obey both the Second Law and the Third Law at the same time, and this traps it in a loop of repetitive behavior.
Asimov was very optimistic about the future of humanity. The story is set in 2015: Powell, Donovan, and Speedy are sent to Mercury to restart operations at a mining station abandoned ten years before. They discover that the photocell banks that provide life support to the base are short on selenium and will soon fail. The nearest selenium pool is some 27 km away, and since Speedy can withstand Mercury’s high temperatures, Donovan sends him to fetch it. When Speedy has not returned after five hours, Powell and Donovan become worried and send a more primitive robot to find Speedy and try to analyze what happened to it.
Because the order to retrieve the selenium was casually worded, with no particular emphasis, Speedy cannot decide whether to obey it, following the Second Law, or to protect himself from danger, following the strengthened Third Law. He therefore oscillates between positions: farther from the selenium pool, the order outweighs the need for self-preservation and draws him in; nearer the pool, the Third Law’s compulsion is stronger and pushes him back. The conflicting Laws create what is essentially a feedback loop: Speedy oscillates around the point where the two compulsions are of equal strength, which makes the robot appear inebriated.
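The dynamics of that feedback loop can be caricatured in a few lines of code. This is purely an illustrative sketch, not anything from the story: the two “compulsions” are modeled as invented, distance-dependent forces (a constant pull from the order, a repulsion from danger that grows as the robot nears the pool), and the constants are arbitrary. A robot released at a distance from the pool then overshoots the balance point and keeps oscillating around it, never settling.

```python
# Toy model of Speedy's dilemma. All force laws and constants here are
# invented for illustration; only the qualitative behavior mirrors the story.

def second_law_pull(d):
    """Attraction toward the pool from the order: constant strength."""
    return 1.0

def third_law_push(d, danger=5.0):
    """Repulsion from the danger at the pool: stronger the closer he gets."""
    return danger / (d + 1.0)

def simulate(d0=10.0, steps=300, dt=0.1):
    """Track Speedy's distance to the pool over time (semi-implicit Euler)."""
    d, v = d0, 0.0          # distance to pool, velocity (positive = away)
    path = []
    for _ in range(steps):
        # Net compulsion: danger repulsion minus the order's pull.
        a = third_law_push(d) - second_law_pull(d)
        v += a * dt
        d = max(d + v * dt, 0.0)
        path.append(d)
    return path

# The compulsions balance where danger/(d+1) == 1, i.e. at d = 4 here.
# Starting farther out, the robot swings through that point, gets pushed
# back out, and oscillates around it indefinitely.
path = simulate()
```

With no damping term, the oscillation never decays, which matches the plot: nothing inside Speedy can resolve the tie, and only an external change, such as the First Law overriding both compulsions, breaks the loop.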
Under the Second Law, Speedy should obey Powell’s order to return to base, but the order fails to register, as the conflicted positronic brain cannot accept new orders. An attempt to increase the Third Law’s compulsion also fails: they place an acid, which can destroy Speedy, in his path, but it merely causes the robot to change his route until he finds a new equilibrium between the avoid-danger law and the follow-orders law.
When they eventually find Speedy, they discover he is running in a huge circle around a selenium pool. Further, they notice that “Speedy’s gait [includes] a peculiar rolling stagger, a noticeable side-to-side lurch”. When the robot is asked to return with the selenium, he begins talking oddly, showing symptoms that, if he were human, would be interpreted as drunkenness. Powell eventually realizes that the selenium source contains unforeseen danger to the robot. Under normal circumstances, Speedy would observe the Second Law, but because he was so expensive to manufacture, and “not a thing to be lightly destroyed”, the Third Law had been strengthened “so that his allergy to danger is unusually high”. The only thing that trumps both the Second Law and the Third Law is the First Law of Robotics. Therefore, Powell decides to risk his own life by going out in the heat, hoping that the First Law will force Speedy to overcome his cognitive dissonance to save Powell’s life. The plan works, and the team is able to repair the photocell banks.
The original laws have been altered and elaborated on by Asimov and other authors. Asimov himself made slight modifications to the first three in various books and short stories to further develop how robots would interact with humans and each other. In later fiction, where robots had taken responsibility for the government of whole planets and human civilizations, Asimov also added a fourth law, the Zeroth Law, to precede the others:
0. A robot may not harm humanity, or, by inaction, allow humanity to come to harm.