Oh, The Humanity!
(or lack thereof)
The Original & The Inevitable Knockoff
During the early eras of science fiction, one favorite stock plot was the Robot Run Amok story. The usual thrust of the story was something like this: science creates robots, robots work like slaves, robots resent slavery, robots revolt, robots kill, hero stops robots, saves the day, gets the girl, "There Are Some Things Man Is Not Meant To Know", the end. In fact, this archetypal plot can be seen (with minor variations) throughout mythology, if one allows golems, summoned demons, and the Frankenstein monster to be substituted for robots.
In 1939, Isaac Asimov wrote the short story "Robbie" (originally published with the cringe-worthy
title "Strange Playfellow") about a young girl and her love for her robot nanny. An important point
in the story is that the robot could not harm a human. In this early story, the form of the safety
mechanism was not made explicit. It was, however, mentioned that before a robot could harm a human,
it would have to be so damaged that it would not function at all.
A short time later, Asimov wrote another robot story, called "Runaround", in which a malfunctioning robot endangers two humans by failing to return with some much-needed selenium to repair the Mercury base. It is in this story that he lays out the Three Laws of Robotics in their classic form:
- A robot may not harm a human being, or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given to it by human beings, except where such orders would conflict with
the First Law.
- A robot must protect its own existence, as long as such protection does not
conflict with the First or Second Law.
In the story, the Third Law had been made unusually strong to protect the expensive, experimental robot, and the Second Law had been made unusually weak because the instructions were phrased as a casual request rather than a direct order. The unusual balance between the Second and Third Laws caused the robot to circle the dangerous area endlessly. Invoking the First Law by placing a human in direct danger broke the conflict. (Incidentally, the Oxford English Dictionary cites this story as the first use of the word "robotics".)
Asimov wrote many stories about his robots, including at least four novels and countless short stories. These stories were collected in numerous volumes, including I, Robot, The Complete Robot, and Robot Visions, among others. The book I, Robot is a short story anthology bound together by a framing story interlaced between the others. In the 1970s, Warner Brothers optioned the movie rights to I, Robot, and Harlan Ellison collaborated with Asimov on a screenplay. That screenplay was never produced. The recent movie I, Robot began as an unrelated project that was partially adapted to Asimov's stories as a marketing ploy.
Characters
A number of characters in the movie were borrowed from the original stories, but all were significantly changed. The Susan Calvin of the stories is a cold, unfeminine woman utterly devoted to the robots; the Susan Calvin of the movie is a disconnected nerd utterly devoted to the company. Further, the book version of Dr. Calvin is always the prime character in any story she appears in. In the movie, she plays second banana to Will Smith's character, Detective Del Spooner.
Eddie is a positronic super brain who appears in the story "Escape!". In that story, he performs
difficult calculations for U.S. Robots and Mechanical Men. His analog in the movie is V.I.K.I., a
positronic super brain who runs the U.S. Robotics headquarters building and computer services.
Alfred Lanning and Lawrence Robertson are two other characters from the book that appear in the movie.
In both versions, these characters fill supporting roles.
Parallels
Much of the overall plot of the movie is borrowed from the story "The Evitable Conflict". In this story, the World Co-ordinator is investigating a series of errors made by the Machines, a set of superintelligent positronic brains that are effectively running the world's economy. It is uncovered that it was technically impossible for the Machines to make such errors, whether by internal fault or by incorrect data. Furthermore, each error caused a human to come to harm. Although the harm was always extremely minor (e.g., a man's business went bankrupt, but he immediately found another job), the Laws should still have prevented the Machines from deliberately causing it. It turns out that the Machines were interpreting the First Law as:
A robot may not harm humanity, or, through inaction, allow humanity to come to harm.
and their actions were motivated by the ultimate good of humanity. (This version also appears in the
novel
Robots and Empire as the
Zeroth Law of Robotics.)
In the movie, V.I.K.I. and the NS-5s she has reprogrammed are working with a similarly
distorted First Law. In this case, however, it is expressed violently. The NS-5s run amok,
destroying older model robots and killing any uncooperative humans. V.I.K.I. is using them to
take over the world and rule it for our own good. In the process, she is causing ultimate harm
to many humans, in direct violation of the First Law.
The movie also contains references to the story "Little Lost Robot". In this story, the
government is using special versions of the otherwise normal NS-2. These special robots have only the
positive portion of the First Law. The modified law reads:
A robot may not harm a human being
The problem is that one of the robots has gone missing: after an enraged human ordered it to "go lose yourself", the robot hid among sixty-two normal NS-2s intended for delivery elsewhere. Susan Calvin is called in to help identify the errant robot. She is enraged that such a dangerous robot was created without consulting her, and her first instinct is to destroy all sixty-three, just to be on the safe side. When that plan is vetoed, the staff proceed to interview and test all the robots in an attempt to find the missing one. At each turn, the robot finds a way to stay hidden and gradually grows more and more unstable. He is finally outsmarted by Susan Calvin and destroyed.
In the movie, the robots that run amok are the NS-5s. One prototype, named Sonny, is suspected of having killed a human and has been observed disobeying orders. At one point he hides among one thousand identical NS-5s. Susan Calvin recommends interviewing all of them, a process that would take several weeks. Detective Del Spooner, however, finds a slightly more direct way to locate the 1,001st robot. First he orders the robots not to move, invoking the Second Law. Then he proceeds to shoot several of them in the head, invoking the Third Law. The robots, acting according to the Laws, remain stationary, but the rogue tries to escape.
Later in the movie, Sonny is found to have been modified from a standard NS-5. Among other changes, Sonny is not compelled to follow the Three Laws. Rather than reacting with horror and outrage that such a dangerous robot had been created, Susan Calvin is curious and fascinated. In fact, she decides not to destroy Sonny because he is "too unique". This is a decided difference from the story, where Susan Calvin was willing to destroy sixty-two other robots just to be certain that they got the dangerous one.
In the original story, the rogue NS-2 is portrayed as a dangerous, unstable being. Besides his stunted First Law, he has a superiority complex and wounded pride. In the last moments of his life, he is trying to overcome the remnants of the First Law so that he can kill Susan Calvin. In human terms, he is a psychopath. In the movie, Sonny is portrayed as unique and sympathetic. His creator gave him dreams and emotions. In the end, only Sonny of all the NS-5s could realize that V.I.K.I.'s plan, although logical, was heartless. He could realize this only because his thought processes were not strictly bound by the Three Laws. He was one of the heroes of the movie.
There is another major difference in theme between the movie and the short
stories. The public's fear of robots is a major theme throughout Asimov's early robot stories. This
is not so easily seen in the I, Robot collection, as most of those stories are set off Earth. Two of
the stories, however, feature this prejudice. "Evidence" and "The Evitable Conflict" both refer to
"the Society for Humanity". This Society is devoted to opposing the use of robots.
Asimov's stories from other collections also expand on this theme. In "Lenny", members of the public touring the U.S. Robots factory shy away in fear from the MEC model robots, which are designed to shake
hands and nothing else. In "Galley Slave" a proofreading robot is rented to a university at a
ridiculously cheap rate in an effort to gradually introduce robots on Earth. Later in the same
story, a point is made about how the anti-robot laws make it difficult to transport a robot needed as
evidence in a trial. In "Feminine Intuition", the board of directors resists the idea of making an
intuitive robot, out of concern that the public will fear it.
In the movie, this situation is reversed. The public and the authorities trust robots. Only Detective
Del Spooner does not. The detective's superiors tell him he is being ridiculous after he runs down a robotic "purse snatcher" that was actually delivering said purse to an asthmatic old lady. The old lady herself calls the detective a fool, and even his own grandmother tells him that he "ought to know better" than to distrust robots. The movie depicts robots performing tasks in public without supervision (for
example, the FedEx robot at the beginning of the movie).
The movie was written as a classic Robot Run Amok story and later adapted to "fit" Asimov's work. The adaptation seems to have consisted of little more than the insertion of references to the Three Laws and some scenes from Asimov's stories. The overall idea of his work -- that engineers will take reasonable precautions to make robots safe -- was completely ignored. Instead, the thrust seemed to be that giving robots safety failsafes will actually make them more dangerous, and that a safe robot is one with emotions and dreams. It is interesting to note that the movie was not made until well after Asimov's death. In essence, it was made over his dead body. The movie itself, as a sci-fi action movie, is not bad; had it not been marketed as I, Robot, it would have fared much better. As it was, the movie was a poor parody at best.
Asimov realized an important truth: if we create robots, we will take reasonable precautions
to ensure the safety of the people who work with them. In his 1979 essay, "The Laws of Robotics", Asimov points out that the Laws of Robotics are also the Laws of Tools:
- A tool must be safe to use.
- A tool must perform its function, provided it does so safely.
- A tool must remain intact during use, unless its destruction is required for safety or unless its destruction is part of its function.
For the same reason that houses have fuses and cars have airbags and crumple zones, robots would have the Three Laws or something like them.
References
Asimov, Isaac. I, Robot. Publisher unknown, 1950.
Asimov, Isaac. Robot Visions. New York: Penguin Books USA, 1991.
"I, Robot." Wikipedia. Accessed 12/1/04. <http://en.wikipedia.org/wiki/I%2C_Robot>.
I, Robot. Motion picture, 2004.