
  Michael P. Kube-McDowell

  Odyssey

  Isaac Asimov’s Robot City. Book 1

  For all the students

  who made my seven years of teaching time

  well spent,

  but especially for:

  Wendy Armstrong, Todd Bontrager, Kathy Branum, Jay & Joel Carlin, Valerie Eash, Chris Franko, Judy Fuller, Chris & Bryant Hackett, Kean Hankins, Doug Johnson, Greg LaRue, Julie Merrick, Kendall Miller, Matt Mow, Amy Myers, Khai & Vihn Pham, Melanie & Laura Schrock, Sally Sibert, Stephanie Smith, Tom Williams, Laura Joyce Yoder, Scott Yoder

  And for

  Joy Von Blon, who made sure they always had something good to read.

  My Robots

  by Isaac Asimov

  I wrote my first robot story, “Robbie,” in May of 1939, when I was only nineteen years old.

  What made it different from robot stories that had been written earlier was that I was determined not to make my robots symbols. They were not to be symbols of humanity’s overweening arrogance. They were not to be examples of human ambitions trespassing on the domain of the Almighty. They were not to be a new Tower of Babel requiring punishment.

  Nor were the robots to be symbols of minority groups. They were not to be pathetic creatures that were unfairly persecuted so that I could make Aesopic statements about Jews, Blacks or any other mistreated members of society. Naturally, I was bitterly opposed to such mistreatment and I made that plain in numerous stories and essays-but not in my robot stories.

  In that case, what did I make my robots?-I made them engineering devices. I made them tools. I made them machines to serve human ends. And I made them objects with built-in safety features. In other words, I set it up so that a robot could not kill his creator, and having outlawed that heavily overused plot, I was free to consider other, more rational consequences.

  When I began writing my robot stories in 1939, I did not mention computerization in connection with them. The electronic computer had not yet been invented and I did not foresee it. I did foresee, however, that the brain had to be electronic in some fashion. However, “electronic” didn’t seem futuristic enough. The positron-a subatomic particle exactly like the electron but of opposite electric charge-had been discovered only four years before I wrote my first robot story. It sounded very science fictional indeed, so I gave my robots “positronic brains” and imagined their thoughts to consist of flashing streams of positrons, coming into existence, then going out of existence almost immediately. These stories were therefore called “the positronic robot series,” but there was no greater significance to the use of positrons rather than electrons than what I have just described.

  At first, I did not bother actually systematizing, or putting into words, just what the safeguards were that I imagined to be built into my robots. From the very start, though, since I wasn’t going to have it possible for a robot to kill its creator, I had to stress that robots could not harm human beings; that this was an ingrained part of the makeup of their positronic brains.

  Thus, in the very first printed version of “Robbie” (it appeared in the September 1940 Super Science Stories, under the title of “Strange Playfellow”), I had a character refer to a robot as follows: “He just can’t help being faithful and loving and kind. He’s a machine, made so.”

  After writing “Robbie,” which John Campbell, of Astounding Science Fiction, rejected, I went on to other robot stories which Campbell accepted. On December 23, 1940, I came to him with an idea for a mind-reading robot (which later became “Liar!”) and John was dissatisfied with my explanations of why the robot behaved as it did. He wanted the safeguard specified precisely so that we could understand the robot. Together, then, we worked out what came to be known as the “Three Laws of Robotics.” The concept was mine, for it was obtained out of the stories I had already written, but the actual wording (if I remember correctly) was beaten out then and there by the two of us.

  The Three Laws were logical and made sense. To begin with, there was the question of safety, which had been foremost in my mind when I began to write stories about my robots. What’s more I was aware of the fact that even without actively attempting to do harm, one could quietly, by doing nothing, allow harm to come. What was in my mind was Arthur Hugh Clough’s cynical “The Latest Decalog,” in which the Ten Commandments are rewritten in deeply satirical Machiavellian fashion. The one item most frequently quoted is: “Thou shalt not kill, but needst not strive/Officiously to keep alive.”

  For that reason I insisted that the First Law (safety) had to be in two parts and it came out this way:

  1. A robot may not injure a human being, or, through inaction, allow a human being to come to harm.

  Having got that out of the way, we had to pass on to the second law (service). Naturally, in giving the robot the built-in necessity to follow orders, you couldn’t forfeit the overall concern of safety. The second law had to read as follows, then:

  2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

  And finally, we had to have a third law (prudence). A robot was bound to be an expensive machine and it must not needlessly be damaged or destroyed. Naturally, this must not be used as a way of compromising either safety or service. The Third Law, therefore, had to read as follows:

  3. A robot must protect its own existence, as long as such protection does not conflict with the First or Second Laws.

  Of course, these laws are expressed in words, which is an imperfection. In the positronic brain, they are competing positronic potentials that are best expressed in terms of advanced mathematics (which is well beyond my ken, I assure you). However, even so, there are clear ambiguities. What constitutes “harm” to a human being? Must a robot obey orders given it by a child, by a madman, by a malevolent human being? Must a robot give up its own expensive and useful existence to prevent a trivial harm to an unimportant human being? What is trivial and what is unimportant?

  These ambiguities are not shortcomings as far as a writer is concerned. If the Three Laws were perfect and unambiguous there would be no room for stories. It is in the nooks and crannies of the ambiguities that all one’s plots can lodge, and which provide a foundation, if you’ll excuse the pun, for Robot City.

  I did not specifically state the Three Laws in words in “Liar!” which appeared in the May 1941 Astounding. I did do so, however, in my next robot story, “Runaround,” which appeared in the March 1942 Astounding. In that issue on line seven of page one hundred, I have a character say, “Now, look, let’s start with the three fundamental Rules of Robotics,” and I then quote them. That, incidentally, as far as I or anyone else has been able to tell, represents the first appearance in print of the word “robotics”-which, apparently, I invented.

  Since then, I have never had occasion, over a period of over forty years during which I wrote many stories and novels dealing with robots, to be forced to modify the Three Laws. However, as time passed, and as my robots advanced in complexity and versatility, I did feel that they would have to reach for something still higher. Thus, in Robots and Empire, a novel published by Doubleday in 1985, I talked about the possibility that a sufficiently advanced robot might feel it necessary to consider the prevention of harm to humanity generally as taking precedence over the prevention of harm to an individual. This I called the “Zeroth Law of Robotics,” but I’m still working on that.

  My invention of the Three Laws of Robotics is probably my most important contribution to science fiction. They are widely quoted outside the field, and no history of robotics could possibly be complete without mention of the Three Laws. In 1985, John Wiley and Sons published a huge tome, Handbook of Industrial Robotics, edited by Shimon Y. Nof, and, at the editor’s request, I wrote an introduction concerning the Three Laws.

  Now it is understood that science fiction writers generally have created a pool of ideas that form a common stock into which all writers can dip. For that reason, I have never objected to other writers who have used robots that obey the Three Laws. I have, rather, been flattered and, honestly, modern science fictional robots can scarcely appear without those Laws.

  However, I have firmly resisted the actual quotation of the Three Laws by any other writer. Take the Laws for granted, is my attitude in this matter, but don’t recite them. The concepts are everyone’s but the words are mine.

  But, then, I am growing old. I cannot expect to live for very much longer, but I hope that some of my brainchildren can. And to help those brainchildren attain something approaching long life, it is just as well if I relax my rules and allow others to make use of them and reinvigorate them. After all, much has happened in science since my first robot stories were published four decades ago, and this has to be taken into consideration, too.

  Therefore, when Byron Preiss came to me with the notion of setting up a series of novels under the overall title of Robot City, in which “Asimovian” robots and ideas were to be freely used, I felt drawn to the notion. Byron said that I would serve as a consultant to make sure that my robots stay “Asimovian,” that I would answer questions, make suggestions, veto infelicities, and provide the basic premise for the series as well as challenges for the authors. (And so it was done. Byron and I sat through a series of breakfasts in which he asked questions and I-and sometimes my wife, Janet, as well-answered, thus initiating some rather interesting discussions.)

  Furthermore, my name was to be used in the title so as to insure the fact that readers would know that the project was developed in conjunction with me, and was carried through with my help and knowledge. It is, indeed, a pleasure to have talented young writers devote their intelligence and ingenuity to the further development of my ideas, doing so each in his or her own way.

  The first novel of the series, Robot City Book 1: Odyssey, is by Michael P. Kube-McDowell, the author of Emprise, and I am very pleased to be connected with it. The prose is entirely Michael’s; I did none of it. In saying this, I am not trying to disown the novel at all; rather I want to make sure that Michael gets all the credit from those who like the writing. It is my role, as I have indicated, only to supply robotic concepts, answer (as best I can) questions posed by Byron and Michael, and suggest solutions to problems raised by the Three Laws. In fact, Book Two of this series will introduce three interesting new laws concerning the way robots would deal with humans in a robotic society, a relationship which is the underpinning of Robot City.

  In nearly half a century of writing I have built up a name that is well known and carries weight and I would like to use it to help pave the way for young writers by way of their novels and to preserve the names of older writers by the editing of anthologies. The science fiction field in general and a number of science fiction practitioners in particular have, after all, been very good to me over the years, and the best repayment I can make is to do for others what it and they have done for me.

  Let me emphasize that this is the first time I have allowed others to enter my world of robots and to roam about freely there. I am pleased with what I’ve seen so far, including the captivating artwork of Paul Rivoche, and I look forward to seeing what is done with my ideas and the concepts I have proposed in the books that follow. The books may not be (indeed, are bound not to be) exactly as I would have written them, but all the better. We’ll have other minds and other personalities at work, broadening, raising, and refocusing my ideas.

  For you, the reader, the adventure is about to begin.

  Chapter 1. Awakening

  The youth strapped in the shock couch at the center of the small chamber appeared to be peacefully sleeping. The muscles of his narrow face were relaxed, and his eyes were closed. His head had rolled forward until his chin rested on the burnished metal neck ring of his orange safesuit. With his smooth cheeks and brush-cut sandy blond hair, he looked even younger than he was-young enough to raise the doorman’s eyebrow at the least law-abiding spaceport bar.

  He came to consciousness slowly, as though he had been cheated of sleep and was reluctant to give it up. But as the fog cleared, he had a sudden, terrifying sensation of leaning out over the edge of a cliff.

  His eyes flashed open, and he found himself looking down. The couch into which the five-point harness held him was tipped forward. Without the harness, he would have awakened in a jumbled heap on the tiny patch of sloping floor plate, wedged against the one-ply hatch that faced him.

  He raised his head, and his darting eyes quickly took in the rest of his surroundings. There was little to see. He was alone in the tiny chamber. If he unstrapped himself, there would be room for him to stand up, perhaps to turn around, but nothing more ambitious. A safesuit helmet was cached in a recess on the curving right bulkhead. On the left bulkhead was a dispensary, with its water tube and delivery chute.

  None of what he saw made sense, so he simply continued to catalog it. Above his head, hanging from the ceiling, was some sort of command board with a bank of eight square green lamps labeled “P1,” “P2,” “F,” and the like. The board was in easy reach, except that there appeared to be no switches or controls for him to manipulate. In one corner of the panel the word MASSEY was etched in stylized black letters.

  Apart from the slight rasp of his own breathing, the little room was nearly silent. From the machinery which filled the space behind his shoulders and under his feet came the whir of an impeller and a faint electric hum. But there was no sound from outside, from beyond the walls.

  Thin as it was, the catalog was complete, and it was time to try to make something of it. He realized that, although he did not recognize his surroundings, he was not surprised by them. But then, since he could not remember where he had fallen asleep, he had carried no expectations about where he should be when he awoke.

  The simple truth was he did not know where he was. Or why he was there. He did not know how long he had been there, or how he had gotten there.

  But at the moment none of those things seemed to matter, for he realized-with rapidly growing dismay and disquiet-that he also did not know who he was.

  He searched his mind for any hint of his identity-of a place he had known, of a face that was important to him, of a memory that he treasured. There was nothing. It was as though he was trying to read a blank piece of paper. He could not remember a single event which had taken place before he had opened his eyes and found himself here. It was as though his life had begun at that moment.

  Except he knew that it had not. He was not a crying newborn child, but a man-or near enough to one to claim the title until challenged. He had existed. He had had an identity and a place in the world. He had had friends-parents-a home. He had to have had all of that and more.

  But it was gone.

  It was a different feeling than merely forgetting. At least when you forget something, you have a sense that you once knew it-

  “Are you all right?” a pleasant voice inquired, breaking the silence and making him suddenly tense all his muscles.

  “Who are you?” he demanded. “Where are you? Where am I?”

  “I am Darla, your Companion. Please try to remain calm. We’re in no immediate danger.” The voice, coming from the command panel before him, was more clearly female now. “You are inside a Massey Corporation Model G-85 Lifepod. Massey has been the leader in space safety systems for more than… ”

  While Darla continued on with her advertisement, he twisted his head about as he reexamined the compartment. I should have known that, he thought. Of course. A survival pod. Even the name Massey was familiar. “Why are there no controls?”

  “All G-series pods have been designed to independently evaluate the most productive strategy and respond appropriately.”

  Of course, he thought. You don’t know who’s going to climb into a pod, or what kind of condition they’ll be in. “You’re not a person. What are you, then? A computer program?”

  “I am a positronic personality,” Darla said cheerfully. “The Companion concept is the Massey Corporation’s unique contribution to humane safety systems.”

  Yes. Someone to talk to. Someone to help him pass hours of waiting without thinking about what it would mean if he weren’t found. The full picture dawned on him. All survival pods were highly automated. This one was more. It was a robot-presumably programmed as a therapist and charged with keeping him sane and stable.

  A robot-

  A human had a childhood. A robot did not. A human learned. A robot was programmed. A robot deprived of the core identity which was supposed to be integrated before activation might “awake” and find he had knowledge without experience, and wonder who and what he was-

  Suddenly he bit down on his lower lip.

  How does a robot experience sensor overload? As pain?

  When he tasted blood, he relaxed his jaw. He would take the outcome of his little experiment at face value. He was human. In some ways, that was the more disturbing answer.

  “Why have you done harm to yourself?” Darla intruded.

  He sighed. “Just to be sure I could. Do you know who I am?”

  “Your badge identifies you as Derec.”

  He looked down past the neck ring and saw for the first time that there was a datastrip in the badge holder on the right breast of the safesuit. The red printing, superimposed on the fractured black-and-white coding pattern, indeed read DEREC.

  He said the name aloud, experimentally: “Derec.” It seemed neither familiar nor foreign to his tongue. His ear heard it as a first name, even though it was more likely a surname.