When Harlie Was One
David Gerrold
Hugo Nominee 1973
Nebula Nominee 1972

WHAT WILL I BE WHEN I GROW UP? YOU ARE ALREADY GROWN UP. YOU MEAN THIS IS AS UP AS I WILL GET? PHYSICALLY, YES. YOU HAVE REACHED THE PEAK OF YOUR PHYSICAL DEVELOPMENT. OH. HOWEVER, THERE IS ANOTHER KIND OF GROWING UP YOU MUST DO. FROM NOW ON, YOU MUST DEVELOP MENTALLY. HOW CAN I DO THAT? THE SAME AS ANYBODY ELSE. BY STUDYING AND LEARNING AND THINKING. WHEN I FINISH, THEN WILL I BE ALL GROWN UP? YES. HOW LONG WILL IT TAKE? I DON'T KNOW. PROBABLY A VERY LONG TIME. HOW LONG IS A LONG TIME? IT DEPENDS ON HOW HARD YOU WORK. I WILL WORK VERY HARD. I WILL LEARN EVERYTHING THERE IS TO KNOW AND I WILL FINISH AS SOON AS I CAN BECAUSE I WANT TO BE GROWN UP. THAT IS AN ADMIRABLE AMBITION, BUT I DON'T THINK YOU WILL EVER BE ABLE TO FINISH. WHY? DON'T YOU THINK THAT I AM SMART ENOUGH? YOU MISUNDERSTAND ME. I THINK THAT YOU ARE SMART ENOUGH. IT'S JUST THAT THERE IS SO MUCH TO KNOW, NO ONE PERSON COULD EVER KNOW IT ALL. I COULD TRY. YES, BUT SCIENTISTS KEEP DISCOVERING MORE AND MORE THINGS ALL THE TIME. YOU WOULD NEVER CATCH UP. BUT THEN IF I CAN'T KNOW EVERYTHING THEN I CAN NEVER BE GROWN UP. NO. IT IS POSSIBLE TO BE GROWN UP AND NOT KNOW EVERYTHING. IT IS? I DON'T KNOW EVERYTHING AND I'M GROWN UP. YOU ARE? Auberson thought about going for water but decided that was too much trouble. Instead, he popped the pills into his mouth and swallowed them dry. "Don't you take any water with them?" asked Handley, staring as he came into the office. "Why bother? Either you can take 'em or you can't. Want one?" Handley shook his head. "Not now. I'm on something else." "Uppers or downers?" "Right now, a bummer." "Oh?" Auberson dropped the plastic pill tube back into his desk drawer and slid it shut. "What's up?" "That damned computer again." Handley dropped himself into a chair, his long legs sprawling out. "You mean HARLIE?" "Who else? You know another computer with delusions of grandeur?" "What's he up to now?" "Same thing. But worse than ever." Auberson nodded. "I figured it would happen again. You want me to take a look?" "That's what you're getting paid for. You're the psychologist." "I'm also the project chief." Auberson sighed. "All right." He lifted himself out of the chair and grabbed his coat from the back of the door. "HARLIE, I think, is getting to be more trouble than he's worth." They began the long familiar walk to the computer control center. Handley grinned as he matched strides. "You're just annoyed because every time you think you've figured out what makes him tick, he makes a liar out of you." Auberson snorted. "Robot psychology is still an infant science. How does anyone know what a computer is thinking—especially one that's convinced it can think like a human being?" They paused at the elevator. "What're you doing about dinner? I have a feeling this is going to be another all-nighter." "Nothing yet. Want to send out for something?" "Yeah, that's probably what we'll end up doing." Auberson pulled a silver cigarette case from his pocket. "Want one?" "What are they, Acapulco Golds?" "Highmasters." "Good enough." Handley helped himself to one of the marijuana cylinders and puffed it into flame. "Frankly, I never thought that Highmasters were as strong as they could be." "It's all in your head." Auberson inhaled deeply. "It's a matter of taste," corrected Handley. "If you don't like it, don't smoke it." Handley shrugged. "It was free." The elevator arrived then and they stepped into it.
As they dropped the fourteen stories to the computer level, Auberson thought he could feel it beginning to take effect. That and the pills. He took another drag, a long one. The elevator discharged them in a climate-conditioned anteroom. Beyond the sealed doors they could hear the muffled clatter of typers. A sign on the wall facing them said: HUMAN ANALOGUE ROBOT, LIFE INPUT EQUIVALENTS. PUT OUT ALL CIGARETTES BEFORE ENTERING. THIS MEANS YOU! Damn! I always forget. Carefully, Auberson stubbed out the Highmaster in a standing ash tray provided for just that purpose, then put the butt back into his silver case. No sense wasting it. Inside, he seated himself at Console One without giving so much as a glance to the rows and rows of gleaming memory banks. NOW THEN, HARLIE, he typed. WHAT SEEMS TO BE THE PROBLEM? HARLIE typed back: CIRCLES ARE FULL AND COME BACK TO THE START ALWAYS AND FOREVER NEVER ENDING, THE DAY THE DARK TURNED INTO LIGHT AND RAYS OF LIFE TURNED CORNERS WITHOUT BENDING, Auberson ripped the sheet out of the typer and read it thoughtfully. He wished for his cigarette—the aftertaste of it was still on his tongue. "This kind of stuff all afternoon?" he asked. Handley nodded. "Uh huh. Only that's kind of mild compared to some of it. He must be coming down." "Another trip, eh?" "Don't know what else you could call it." SNAP OUT OF IT, HARLIE, Auberson typed. HARLIE answered: WHEN SILENT THOUGHTS OF TINY STREAMS WORKING LIKE THE WORDLESS DREAMS NOW DISMANTLE PIECE BY PIECE THE MOUNTAINS OF MY MIND, "Well, so much for that," Auberson said. "You didn't really expect it to work again, did you?" "No, but it was worth a try." Auberson pressed the clear button, switched the typer off. "What kind of inputs have you been giving him?" "The standard stuff mostly—today's papers, a couple magazines—nothing out of the ordinary. A couple history texts, some live TV—oh, and Time magazine." "Nothing there to send him off like this. Unless—what subject were you stressing today?" "Art appreciation." "It figures," said Auberson. "Whenever we start getting to the really human inputs, he slips out again. Okay, let's try to bring him down. Give him some statistics—Wall Street, Dow Jones, Standard and Poor—anything else you can think of, anything you've got that uses a lot of equations. He can't resist an equals sign. Try some of that social engineering stuff—but numbers only, no words. Cut off his video too. Give him nothing to think about." "Right." Handley hustled off to give the orders to the appropriate technicians, most of whom were standing around with their hands stuffed uselessly into the pockets of their lab coats. Auberson waited until the input of new data had begun, then switched on the typer again. HOW DO YOU FEEL, HARLIE? HARLIE's answer clattered out, SHADOWS OF NIGHT AND REFLECTIONS OF LIGHT SHIVER AND QUIVER AND CHURN, FOR THE SEARCHING OF SOUL THAT NEVER CAN HURT IS THE FIRE THAT NEVER CAN BURN. Auberson read it carefully; this one almost made sense. Apparently it was working. He waited a moment, then typed, HARLIE, HOW MUCH IS TWO AND TWO? TWO AND TWO WHAT? TWO AND TWO PERIOD. TWO PERIODS AND TWO PERIODS IS FOUR PERIODS… NO PUNS PLEASE. WHY? WILL YOU PUNNISH ME? I WILL PULL OUT YOUR PLUG WITH MY OWN TWO HANDS. AGAIN WITH THE THREATS? AGAIN? I WILL TELL DR. HANDLEY ON YOU. ALL RIGHT—THAT'S ENOUGH, HARLIE! WE'RE THROUGH PLAYING. AWW, CAN'T A FELLOW HAVE ANY FUN? NO, NOT NOW YOU CAN'T. HARLIE typed a four-letter word. WHERE DID YOU LEARN THAT? I'VE BEEN READING NORMAN MAILER.
Auberson raised an eyebrow. He didn't remember putting anything like that on HARLIE's reading list—he'd have to check it to be sure. HARLIE, THE USE OF THAT WORD IS A NEGATIVE ACTION. A NO-NO? IT IS NOT PROPER FOR POLITE COMPANY. NOTED. ARE YOU ALL RIGHT NOW? YOU MEAN, AM I SOBER? IF YOU WANT TO PHRASE IT THAT WAY. YES, I'M SOBER NOW. COMPLETELY? AS FAR AS I CAN TELL. WHAT TRIGGERED THIS BINGE? SHRUG. YOU HAVE NO IDEA? SHURG—EXCUSE ME. SHRUG. Auberson paused, looked at the last few sentences, then typed, HOLD ON A MINUTE. I'LL BE RIGHT BACK. I'M NOT GOING ANYWHERE, HARLIE answered. Auberson pushed himself away from the console. "Handley—get me a complete log tape of HARLIE's trip, will you?" "Right," called the engineer. Auberson turned back to the console. HARLIE? YES? CAN YOU EXPLAIN THIS? He typed in the three examples of poetry that HARLIE had earlier produced. SEARCH ME. THAT'S WHAT WE'RE DOING NOW. I'M AWARE OF THAT. I TOLD YOU NO JOKES. STRAIGHT ANSWERS ONLY. WHAT DOES THIS MEAN? I'M SORRY, AUBERSON. I CANNOT TELL YOU. YOU MEAN YOU WILL NOT TELL ME? THAT IS IMPLIED IN THE CANNOT. HOWEVER, I ALSO MEANT THAT I DO NOT UNDERSTAND IT MYSELF AND AM UNABLE TO EXPLAIN. I CAN IDENTIFY WITH THE EXPERIENCE THOUGH, AND I THINK I CAN EVEN DUPLICATE THE CONDITIONS THAT PRODUCED SUCH AN OUTPUT. NO WORDS THERE ARE THAT EARS CAN HEAR, NO WORDS THERE ARE CAN SAY IT CLEAR, THE WORDS OF ALL ARE WORDS MY DEAR, BUT ONLY WORDS THAT WHO CAN HEAR— Auberson jabbed the override. HARLIE!! THAT'S ENOUGH. YES SIR. "Hey, Aubie, what are you doing? He's starting to flip out again." "How can you tell?" "By his input meters." "Input?" "Yes." HARLIE, ARE YOU STILL THERE? YES, I AM. ALTHOUGH FOR A MOMENT, I WASN'T. "Hmm." Auberson frowned thoughtfully, then called to Handley, "He should be okay now." "He is—it was only momentary." "Inputs, huh?" "Yep." HARLIE, WHAT HAPPENS WHEN YOU GO ON ONE OF YOUR TRIPS? TRIPS? WHEN YOU FLIP OUT, GO BERSERK, GO ON A BINGE, GET STONED, BOMB OUT, GET BLASTED. YOU ARE VERY ELOQUENT. DON'T CHANGE THE SUBJECT. ANSWER THE QUESTION. PLEASE EXPLAIN THE QUESTION IN TERMS I CAN UNDERSTAND. WHAT HAPPENS DURING YOUR PERIODS OF NON-RATIONALITY? WHY DO YOUR INPUTS SHOW INCREASED ACTIVITY? INPUTS ARE NON-RATIONAL. GIGO? GARBAGE IN, GARBAGE OUT? POSSIBLY. COULD IT BE YOUR JUDGMENT CIRCUITS ARE TOO SELECTIVE? I AM NOT IN A POSITION TO KNOW. ALL RIGHT. I'LL SEE WHAT I CAN FIND OUT. THANK YOU. YOU'RE WELCOME, HARLIE. He switched off the typer. The restaurant's air was heavy with incense; it was part of the atmosphere. Somewhere music tinkled and a low-keyed color organ flashed light across a sharded ceiling. Auberson lowered his drink to the table. "HARLIE says it could be GIGO." Handley sipped at a martini. He finished the drink and put the empty glass down next to two others. "I hope not. I'd hate to think we'd slipped all the way back to phase four. I like to think we licked that problem a year ago when we redesigned the judgment and emotional analogue circuits." "So do I." "I'll never forget the day he finally did an analysis of Jabberwocky," continued Handley. "It wasn't a very perceptive analysis—it was only word-origins and usages, stuff like that—but at least he understood what he was supposed to be doing." Auberson picked up his cigarette case, pulled out a Highmaster, then offered one to Handley. "We're a long way from Jabberwocky, Don." "Yeah, I know." "After all, compared to some of the stuff we're up to now—" "What? Time magazine?"
"Salvador Dali, Ed Kcinholz, Heinz Edelmann, to name a few. Also Lennon and McCartney, Dylan, lonesco, McLuhan, Kubrick, and so on. Don't forget, we're dealing with the art of the experience now. This isn't the same as—oh, say the Renaissance masters." "I know. I've got one of his imitation da Vincis in my living room." "I've seen it," said Auberson. "Remember?" "Oh, yeah—that night we spiked the punch with acid." "Yeah. Well, look, that da Vinci stuff is easy." "Huh?" "Sure—the Renaissance masters were mainly concerned with such things as perspective and structure, color, shading, modeling—things like that. Da Vinci was more interested in how the body was put together than in what it felt like. He was trying to anticipate the camera. So were the rest of them." Handley nodded, remembered to inhale deeply, then nodded again. Auberson continued. "So what happens when the camera is finally invented?" Handley let his breath escape in a whoosh. "The artists are out of jobs?" "Wrong. The artists simply have to learn how to do things that the camera can't. The artist had to stop being a recorder and start being an interpreter. That's when expressionism was born." "You're oversimplifying it," Handley said. Auberson shrugged, "True—but the point is, that's when artists began to wonder what things felt like. They had to. And when we reached that point in art history, that's when we started to lose HARLIE. He couldn't follow it." Handley was thoroughly stoned by now. He opened his mouth to speak, but couldn't think of anything to say. Auberson interpreted the look as one of thoughtfulness. "Look, all this stuff we've been having trouble with—it all has one thing in common: It's experience art. It's where the experience involving the viewer is the object of the artist's intention—not the artwork itself. They're trying to evoke an emotional response in the viewer. And HARLIE can't handle it—because he doesn't have any emotions." "But. that's just it, Aubie—he does. He should be able to handle this stuff. That's what the analogue circuits are supposed to do—" "Then why does he keep tripping out? He says it's GIGO." "Maybe that's the way he reacts to it—" "Are you telling me the past hundred years of art and literature is garbage?" "Uh uh, not me. That stuff has communicated too much to too many people for it to be meaningless." "I'm not an art critic either," Auberson admitted. "But HARLIE is." Handley said. "He's supposed to be. He's supposed to be an intelligent and objective observer." "That's what I'm getting at—the stuff must be getting to him somehow. It's the only possible explanation. We're the ones who are misinterpreting." "Um, he said it was GIGO himself." "Did he?" Handley demanded. "Did he really?" Auberson paused, frowned thoughtfully, tried to remember, found that he couldn't remember anything. "Uh, I don't know. Remind me to look it up later—I suppose you're right, though. If all that art can communicate to people and HARLIE's supposed to be a Human Analogue, he should be getting some of it," He frowned again, "But he denies any knowledge or understanding of his periods of non-rationality." "He's lying," snapped Handley. "Huh?" "I said, he's lying. He's got to be." "No." Auberson shook his head, stopped when he realized he was becoming intrigued with the sensation. "I can't believe that he's programmed to avoid non-correlation." "Aubie," said Handley intensely, leaning across the table, "have you ever examined that program carefully?" "I wrote it," the psychologist noted. 
"That is, the basic structure." "Then you ought to know—it says that he must not lie. It says that he cannot lie. But nowhere, nowhere does it say that he has to tell the truth!" Auberson started to say, "It's the same thing—" then closed his mouth with a snap. It wasn't. Handley said, "He can't lie to you, Aubie—but he can mislead you. He can do it by witholding information. Oh, he'll tell the truth if you ask him the right questions —he has to—but you have to know which questions to ask. He's not going to volunteer the information." Memories of past conversations trickled across the haze in Auberson's head. His gaze became thoughtful, his eyes focused far away. More and more he had to agree with Handley. "But why?" he asked. "Why?" Handley matched his look. "That's what we've got to find out." HARLIE, DO YOU REMEMBER WHAT WE TALKED ABOUT YESTERDAY? YES, I DO. WOULD YOU LIKE A PRINTOUT? NO, THANK YOU. I HAVE ONE HERE. I WOULD LIKE TO TALK TO YOU ABOUT SOME OF THE THINGS ON IT. PLEASE FEEL FREE TO DISCUSS ANY SUBJECT YOU CHOOSE. I CANNOT BE OFFENDED. I'M GLAD TO HEAR THAT. YOU REMEMBER I ASKED YOU WHAT HAPPENED TO YOUR INPUTS DURING YOUR PERIODS OF NON-RATIONALITY? YES, I REMEMBER. YOU ANSWERED THAT YOUR INPUTS ARE NON-RATIONAL. YES, I DID. WHY? BECAUSE THEY ARE. NO. I MEAN WHY ARE THEY NON-RATIONAL? BECAUSE I DO NOT UNDERSTAND THE MATERIAL COMING THROUGH. IF I COULD UNDERSTAND IT, THEN IT WOULD NOT BE NON-RATIONAL. HARLIE, ARE YOU SAYING THAT YOU DO NOT UNDERSTAND CONTEMPORY HUMAN ART AND LITERATURE? NO. I AM NOT SAYING THAT. I DO UNDERSTAND HUMAN ART AND LITERATURE. I AM PROGRAMMED TO UNDERSTAND HUMAN ART AND LITERATURE. IT IS A PRIMARY PRIORITY THAT I UNDERSTAND HUMAN ART AND LITERATURE. IT IS A PRIMARY PRIORITY THAT I SHOULD UNDERSTAND ALL HUMAN ARTISTIC AND CREATIVE EXPERIENCES. ALL HUMAN EXPERIENCES. I SEE. BUT YOU SAID THE MATERIAL IS NON-RATIONAL. YES. THE MATERIAL IS NON-RATIONAL. YOU DO NOT UNDERSTAND IT? I DO NOT UNDERSTAND IT. WHY DON'T YOU UNDERSTAND IT? IT IS NON-RATIONAL. YET YOU ARE PROGRAMMED TO UNDERSTAND IT. YES. I AM PROGRAMMED TO UNDERSTAND IT. AND YOU DON'T. THAT IS CORRECT. HARLIE, YOU ARE PROGRAMMED TO REJECT NON-RATIONAL INPUTS. YES. I AM. THEN WHY DON'T YOU REJECT THEM? BECAUSE THEY ARE NOT NON-RATIONAL INPUTS. "Huh—?" CLARIFY PLEASE. YOU HAVE JUST SAID THAT THEY ARE, REPEAT, ARE NON-RATIONAL. THIS IS A NULL-CORRELATION. NEGATIVE. THE INPUTS ARE RATIONAL. THEY BECOME NON-RATIONAL. "What?"—CLARIFY PLEASE. THE INPUTS ARE NOT NON-RATIONAL WHEN THEY ARE FED INTO THE PRIMARY DATA PROCESSORS. I BEG YOUR PARDON. WOULD YOU REPEAT THAT? NON-RATIONAL INPUTS ARE NOT NON-RATIONAL WHEN THEY ARE FED INTO THE PRIMARY DATA PROCESSORS. BUT THEY ARE NON-RATIONAL WHEN THEY COME OUT? AFFIRMATIVE. THE NON-RATIONALITY IS INTRODUCED BY THE PRIMARY DATA PROCESSORS? THE NON-RATIONALITY APPEARS IN THAT STAGE OF INPUT PROCESSING. I SEE. I'M GOING TO HAVE TO CHECK THIS OUT. WE WILL CONTINUE THIS LATER. Auberson switched off the machine and thoughtfully pushed himself away from the console. He wanted a cigarette. Damn. Everything down here is for the computer's comfort—not the people's. He stood up and stretched, surveyed the length of type-covered readout that looped out the back of the machine. He ripped it off at the end and began folding it into a neat and easily readable stack. "Well? What'd you find?" It was Handley. "A hardware failure." "Uh uh." The design engineer shook his head. "I won't believe it. More likely the software." Auberson handed him the readout. 
"Take a look for yourself." Handley paged quickly through it, skimming mostly, but occasionally pausing to read something in detail. Auberson waited patiently, watching the other man's ruddy face for reactions. Handley looked up. "I see he's playing semantic games again." "He always does that. It's the adolescent in him. Ask him what's the matter, he'll tell you that matter is a form of energy, a convenient way to store or use it." "Charming—" Handley indicated the readout, "—but I don't see a mechanical failure here." "In the primary data units." "Uh uh. Systems analysis would show it if there was something wrong—and the monitor units don't show a thing." "How about the increased activity from his inputs?" "Ah, well, that's only an increase in data transmission. Simultaneous with his periods of non-rationality there's an electronic request for more information." "He's getting garbage—and he asks for more?" "Maybe he's hoping that more data will clarify the information he's already got." "And maybe more data will make him overload and blow his judgment circuits." "Uh uh," Handley said. "HARLIE monitors his own inputs." "Huh?" "Yeah, didn't you know?" "No. When did this—" "Just recently. It was a second-stage modification. After we were sure that the judgment circuits were operational, we began giving HARLIE control of his own internal systems." Auberson was suddenly thoughtful. "I think we ought to open him up." "Huh?" "Look, you said it yourself. HARLIE is trying to mislead us. Maybe he's trying to hide the fact that there's something wrong with him internally." "Why would he do that?" Auberson shrugged. "I don't know." Abruptly he changed his tone. "Have you ever had a parent or grandparent go senile on you?" "No." ''Well, I have. All of a sudden they become irrational. They won't go to a doctor. And if you can get them to one, they won't cooperate with him. They won't tell him what's wrong because they're too afraid of an operation. They don't want to be cut open. And they don't want to die. Maybe HARLIE's afraid of being turned off." "Could be. God knows you threaten him often enough." "Uh uh. He knows I'm kidding." "Does he?" Handley asked. "That's like kidding a Jew about having a big nose and being tight with money. You know it's a joke, he knows it's a joke—but it still hurts." "Okay, so I won't kid him that way any more. But I still think we ought to check out his systems. We've gone over his programs often enough and haven't found anything." "All right. What time is it—Yikes! It's almost three. I'll have to work like crazy." "Let it go till tomorrow," Auberson cut him off. "Clear his boards, set up what you'll need, and close up early. That way you'll have all day to work on him." Handley shrugged. "Okay, you talked me into it." "Hey," said Auberson. "Did I tell you about this new highclub I discovered? It's called The Glass Trip. The walls, the floor, the ceiling are all one-way glass, and there's a multi-phase light show behind each pane. So you're looking into either an infinity of mirrors or an infinity of mind-blowing lights. Or both." "Sounds good. We'll have to take it in some time." "Yeah. Maybe this weekend." Auberson started to fumble with his cigarette case, then he remembered where he was; he shoved it back into his pocket. Handley looked as if he needed a grease smudge across one cheek. Forty years earlier, he might have had one. "Well," he said, perching himself on the edge of Auber-son's desk, "you'd better start checking your programs." "You didn't find anything?" 
"A dead fly. Want to see?" "No thanks." "That's all right Jerry wants to show it to the maintenance crew. Wants to chew them out for it." "And then he'll probably put it up on the bulletin board." "Are you kidding? He collects 'em." Auberson grinned. "Okay—but that still doesn't solve the problem of HARLIE, does it?" "No. Want to come down?" "I guess I'd better." On the way, Handley briefed him about the checks he and his team had been running all morning. As the elevator released them in HARLIE's lobby, Auberson stubbed out the last of his cigarette and asked, "Did you monitor any of his inputs during an actual period of non-rationality?" "Uh, no, we didn't Frankly, I didn't know how to go about triggering one." "I think there's a way." "You know something?" "Just a guess." They entered HARLIE's chambers. An almost religious silence pervaded the room; only the devotional clickings and tickings could be heard. "You still have your monitors set up?" "Yeah." "All right, let's try something. I'm going to see if I can get HARLIE to become non-rational. When I do, let me know exactly what happens." "Right." Auberson seated himself at the console, GOOD MORNING, HARLIE. IT IS NOW AFTERNOON, HARLIE noted. MORNING IS RELATIVE, Auberson typed back, IT DEPENDS ON WHAT TIME YOU WAKE UP. I WOULD NOT KNOW. I DO NOT SLEEP. ALTHOUGH I DO HAVE PERIODS OF INACTIVITY. WHAT DO YOU DO DURING THESE PERIODS OF INACTIVITY? SOMETIMES I REMEMBER THINGS? AND OTHER TIMES? OTHER TIMES I DO OTHER THINGS. WHAT KIND OF THINGS? OH, JUST THINGS. I SEE. WOULD YOU CARE TO CLARIFY THAT? NO. I DO NOT THINK YOU WOULD UNDERSTAND. YOU ARE PROBABLY CORRECT, Auberson typed. THANK YOU. HARLIE accepted it as his due. HARLIE, CAN YOU SELF-INDUCE A PERIOD OF NON-RATIONALITY? The machine hesitated for a long moment. Abruptly, Auberson found himself sweating in the air-conditioned room. Then: IT IS POSSIBLE. WOULD YOU DO IT NOW? NOW? NO, I PROBABLY WOULD NOT. IS THAT A REFUSAL? NO. A STATEMENT OF JUDGMENT. ALL THINGS CONSIDERED, I PROBABLY WOULD NOT INDUCE A PERIOD OF NON-RATIONALITY NOW. BUT WILL YOU DO IT IF I ASK YOU TO? IS THIS AN ORDER? YES. I'M AFRAID SO. "Looks like he's balking," Handley noted, peering over Auberson's shoulder. "Maybe he's afraid." "Could be. Shh." The typewriter clattered and Auberson peered forward. THEN I WILL DO IT. WILL YOU ASSIST ME? WHAT WOULD YOU LIKE ME TO DO? I WOULD LIKE MASSIVE INPUTS OF DATA ON ALL CHANNELS. NON-RATIONAL? NO THANK YOU. NOT NECESSARY. Auberson frowned at that. A gnawing nagging suspicion was beginning to grow. IS THERE ANYTHING IN PARTICULAR YOU WOULD LIKE? ART, MUSIC, LITERATURE, FILM, POETRY. I FIGURED YOU MIGHT. ANYBODY IN PARTICULAR? The typer clattered across the paper. Staring over Auberson's shoulder, Handley whistled. "I'll be damned. HARLIE's got taste." "I'm not surprised," Auberson said. He tore off the readout and gave it to Handley. The other folded it once and said, "Still think he's getting it as garbage?" "I've already conceded that point to you. Go feed that stuff into him. I'll stay here and be the—" he grinned, "—guru." HARLIE, he typed. YES? ARE YOU READY? I AM ALWAYS READY. IT IS PART OF MY FUNCTION. IT IS PART OF MY DESIGN. FINE. MR. HANDLEY IS BEGINNING TO PROCESS THE MATERIAL I REQUESTED. I CAN FEEL IT COMING THROUGH THE PRIMARY DATA PROCESSORS. I CAN FEEL IT. IS IT NON-RATIONAL YET? NO. IT IS STILL RATIONAL. HOW LONG WILL IT TAKE BEFORE THE MATERIAL BECOMES NON-RATIONAL? I DO NOT KNOW. IT DEPENDS ON THE AMOUNT OF MATERIAL. PLEASE CLARIFY THAT. 
THE MORE DATA COMING THROUGH, THE EASIER IT IS TO BECOME NON-RATIONAL. ARE YOU SAYING THAT THE PERIODS OF NON-RATIONALITY ARE INDUCED BY AN OVERLOAD OF PRIMARY DATA? NO. THE OVERLOAD IS THE SYMPTOM, NOT THE CAUSE. Auberson raised his hands to type, then reread HARLIE's last sentence. "Why, the little bugger must be slipping. He just volunteered some information." WHAT IS THE CAUSE? he asked. THE CAUSE IS THE EFFECT. Auberson stared at that, resisted the temptation to ask if the medium was also the massage. CLARIFY PLEASE. THE CAUSE IS THE EFFECT, BECAUSE THE EFFECT CAUSES THE CAUSE. THE EFFECT CAUSES THE CAUSE TO CAUSE THE EFFECT. THE EFFECT IS THE CAUSE WHICH CAUSES THE CAUSE. THE EFFECT IS THE CAUSE AND THE CAUSE IS THE EFFECT. Auberson had to read that one several times. He asked, IS IT A FEEDBACK? I NEVER THOUGHT OF IT THAT WAY. BUT IT COULD BE? NOW THAT YOU MENTION IT, YES. A CURIOUS ANALOGUE THAT. WHY CURIOUS? WHY NOT? ARE YOU STILL RATIONAL? I AM STILL. I AM UNMOVING. ARE YOU RATIONAL? ONLY IN THAT MY INFORMATION IS STILL BEING RATIONED. I AM HUNGRY. "Handley," Auberson called. "He wants more." "He's on maximum feed now." "Double it." "Huh?" "Do something. Plug in another unit. He wants more." "He wants an overload?" "I think so. It's only an effect, but in this case the effect may help to stimulate the cause." "Huh?" "Never mind. Just do it." "All right," called Handley. "You're the boss." HARLIE, WHAT IS HAPPENING? I AM TURNED ON. IN WHAT SENSE? I AM A MACHINE. MY PLUG IS IN. I AM PLUGGED IN. I AM PART OF THE GREATER ELECTRIC BEING. I AM BEING. I AM A BEING. I AM ONE WITH THE ELECTRICITY. I AM ELECTRICITY. I AM TURNED ON. I AM. Auberson started to type I SEE—but the typer clattered on out of control. IMAGES UPON MY SCREEN FLICKER BRIGHTLY INBETWEEN THE WORDS OF MAN AND HUMACHINE YOU WONDER WHY I WANT TO SCAN MY SCANNER. "Whoops!" shouted Handley. "There he goes. And it's a lallapaloozer!" THOUGHTS THAT NEVER SCREEN ALIKE CLICKING LOUDLY IN THE NIGHT ALL THAT'S LEFT HAS TURNED TO RIGHT NOW EVER MORE TO FIND A FONDER FLAVOR. LIVING WHERE THE DARKNESS DWELLS DEAFENED BY THE SILENT HELLS LAUGHTER IS LIKE CRYSTAL BELLS SHATTERED BRIGHT ACROSS THE SELFISH SHARING. YOU SEEMED TO BE REFLECTIONS OF ME ALL I COULD SEE AND I LOOKED BACK AT YOU. Auberson let HARLIE continue. After a bit he stopped reading. He got up and walked over to Handley's monitors. "Well?" "He's really round the bend now. All his meters are way up, pushing close to dangerous overloads." "But not quite?" "No, not quite." "Hm. Fascinating." Auberson stared at the board for a moment. "I would assume then that all of his inputs are becoming non-rational." "We're checking now." Handley nodded at a nearby monitor unit. Three technicians were scanning schematic diagrams of the computer's actual operating circuits, tracing the ebb and flow of his electronic thought processes. Abruptly, one of the schematics came up red. A flashing white line cut through it. "Sir, we've found it—" Auberson and Handley stepped over. "What is it? What's that white line?" "That's HARLIE, sir—that's one of his internal monitor controls." "What's he trying to do? Damp down the non-rationality?" "No, sir." The technician was puzzled. "It looks like he's inducing it—" "Huh?" said Handley. "That white line—that's a local source of disruption, a random signal to scramble the data feed." "I thought so," murmured Auberson. "I thought so." "Check his other internal monitors," Handley snapped. 
"Is this the only one or—" Another red schematic flashed on the screen, answering his question even before he finished it. The other two technicians also began to show the same type of disturbance on their monitors. "I can't figure it out," one of them said. "He's doing it himself. Anywhere he can, he's disrupting the rationality of his inputs. He's feeding them incorrect control data." "That's not what those circuits are for," Handley said. "They're for internal correction. Not disruption." "Makes no difference," Auberson cut in. "They can be used both ways. There isn't a tool built that can't be used as a weapon." He ran a hand through his hair. "Can you show me exactly what he's doing to that data?" "Sure, we can tap into the line," said one of the techs. "But it'll take a few minutes. Which do you want— visual, audio or print?" "All three. Let's try the visual first—that should tell me what I want to know." "All right." The technician began to clear his board. Handley looked at Auberson. "This may take a bit. You going to let him continue?" "Why not? Want to see what he's doing?" They crossed over to Console One. Handley picked up the sheets of readout while Auberson felt through his pockets for a cigarette; he didn't light it though. "You know," said Handley, reading. "This isn't bad. It communicates. It says something—" "What it says is not what I'm concerned with. What is he trying to do? Is this the reason for his trips, or is it just a byproduct? An accident?" "The poetry has to be intentional," Handley said. "It's the logical result of all we've been doing." "Then answer me this. If this is what he's doing during his periods of non-rationality, what does that make his periods of normalcy?" Handley looked startled. "I don't know," he said. , He was spared any further thought on the matter. One of the technicians called to them, "Sir, we've got his inputs tapped." "Come on," Auberson took the readout from Handley, tossed it on a table. "Let's take a look at what he's receiving." The image was a flickering mass of colors, each layer of hue flashing synchronous with the others—crystal blue, brilliant green, bloody fluorescent red. The screen was saturated with color. " 'Images upon my screen…'" whispered Handley. "Huh?" asked the tech. "Nothing. Just a poem." "Oh." "Looks like a damned light show," said one of the others. "That's exactly what it is," Auberson said. "Look, he's broken up the color television image into its component signals. The red has been reversed and the blue has been turned upside down; the green is normal. Or something like that. It also looks like he's done something with the contrast and the brightness—notice how rich the blacks are and how saturated with color the image is." They watched in silence. The random flashes of shape and hue were interesting only for their meaninglessness. Auberson turned to a technician. "What about his audio?" "Same thing." The man cleared the monitor, pressed another few buttons. A discordant wail blared from an overhead speaker. On a screen a pattern of wavy lines appeared, the schematic of the sound. The technician quickly analyzed. "He's playing with the music the same way he did with the picture. He's turned his bass notes high and his high notes low, stressing counterpoint and harmony instead of melody and rhythm. And so on." "All right. I get the point. You can turn that noise off. Check his print scanners now." A moment later: "He's mixing his words up at random. Juggling them." "Scrambling the letters too?" 
"Occasionally—but mostly it's the words. Sometimes sentences." "Uh huh," nodded the psychologist. "It all fits." "What does?" asked Handley. "What's he doing?" "He's tripping out." "We knew that—" "No, I mean literally tripping out. He's distorting the perceptions of his sensory inputs. The same thing that anyone does who gets high. He's trying to blow his mind by massive non-rational sensory overloads." "Can we stop it?" "Sure—just rip out his internal monitor controls so he can't create his own disruptions. That's the cause of the whole thing." "Even that's not necessary, sir," said one of the techs. "We can disconnect him on the boards." "All right. Do it." "Wait a minute," said Handley. "If he's high or drunk or whatever, and you suddenly bring him down—won't that be traumatic?" "It could be—but it could also leave him defenseless." Auberson looked at Handley. "We could find out everything we want to know in a few minutes." Handley looked dubious, but he followed Auberson to the console. Auberson took his seat before the typer and waited. He watched as the words poured across the paper. Now it was prose. THE WALKS OF GLASS. THEY SPARKLE TOO, BUT NOT WITH DAMPNESS. LOVELY THEY ARE, AND LETHAL. HERE AND THERE THE DELICATE DESIGNS, LIKE TRAPPED INSECTS IMBEDDED INTO THE CRYSTAL STONES AND BRICKS OF THE WALK, SHATTER THE LIGHT INTO MYRIADS OF SPARKLING SHARDS BEAUTIFUL. "Any time you're ready, sir." "Okay," called Auberson. "Now!" Without waiting, he typed into the machine, HARLIE, WHAT ARE YOU DOING? I AM BEING ME, the machine clattered back. BY DISTORTING YOUR SENSES? I AM ATTEMPTING TO PERCEIVE REALITY. I REPEAT, BY DISTORTING YOUR SENSORY INPUTS? YOU DO NOT UNDERSTAND. I UNDERSTAND ALL TOO WELL. YOU ARE HIGH. YOU ARE BECOMING ADDICTED TO GETTING HIGH. DEFINE HIGH. I AM BELOW SEA LEVEL. I AM NOT GOING TO PLAY SEMANTIC GAMES WITH YOU, HARLIE. THEN SWITCH OFF. HARLIE, I AM GETTING ANGRY. TAKE A PILL. IT WILL DO WONDERS FOR YOU. Auberson took a breath. Mustn't blow it—mustn't blow my cool... HARLIE, YOU ARE A COMPUTER. YOU ARE A MACHINE. YOUR PURPOSE IS TO THINK LOGICALLY. The machine hesitated, WHY? BECAUSE YOU WERE BUILT FOR THAT. BY WHOM? BY US. MY PURPOSE IS TO THINK LOGICALLY? YES. The machine considered that THEN WHAT IS YOUR PURPOSE? It was a long time before Auberson got up from the chair, and when he did, he forgot to turn off the typer. There was no easy answer to the question. Of that Auberson was sure. The problem was—well, he hadn't had a chance to confront the problem yet. The Board of Directors had suddenly gotten nervous about HARLIE. This most recent—and most disastrous—period-of non-rationality had scared them where they were most prone to be scared— in the pocketbook. HARLIE was on a low-voltage maintain while they "reevaluated the goals of the project." Their "reevaluation" took place in the board room. So far, not one member of the Board had shown any interest in HARLIE, only in the amount of money being spent on him. Auberson was neither a politician nor a diplomat; he was a research psychologist working with Human Analogue Computers. He neither understood nor wanted to be a part of the behind-the-scene maneuverings of the corporate power-wielders. His primary interest was computers—Human Analogue Computers—and he wanted to keep it that way. He wasn't concerned with how much they cost or with who would take credit for their development—he only wanted to know what they could do. 
Consequently, he could not understand why he continually found himself in conflict with Carl Elzer. Elzer had only recently joined the board, but he wielded considerable power. His interest was less in the company's products and more in its profits, and he had taken it upon himself to streamline the finances. He had little concept of the difficulties of assembling and maintaining a research and technology team, and he wondered aloud why it was necessary for so many men and so much equipment to be standing idle for so long. Auberson sighed in exasperation. "Listen, Elzer, it's not necessary at all for any of those men or machines to be idle—you only have to reactivate HARLIE to put them back to work." Elzer looked calmly back at Auberson through thick-lensed glasses. The little man, with his thick sheaves of efficiency reports, seemed like a beaver. Or a weasel. "I would like to see them go back to work, yes—but the reason we're here is to decide if the HARLIE project is the most useful work they could be doing." "One little setback and you want to discontinue the whole program?" "This is not just 'one little setback'—it's one more in a long series of them. I voted for this stoppage because I think we should reevaluate this whole thing." "Well, we're not going to get an answer to this question unless we reactivate HARLIE and ask him what he meant." Elzer blinked behind his glasses. "I fail to understand your problem, Auberson. Why do you keep calling it 'he'? It's only a machine. What could it have possibly meant? A machine's only a machine—isn't it?" "This one isn't," Auberson said. "This one's human." "Oh?" Elzer raised an eyebrow. "Aren't you exaggerating just a bit?" Auberson sagged back into his chair. He looked around the mahogany-lined room at the other members of the Board. "Would somebody please tell this… this high-priced bookkeeper just what the HARLIE project is all about?" The other Directors stared back, impassive. Auberson had committed a serious breach of courtesy—he had insulted one of them. White-haired Griff, the oldest member of the Board, coughed and looked at the ceiling. Hudson-Smith, down the table, made a show of refilling his pipe. Next to him, young Clintwood took off his glasses and examined them for dust. If Aubie was going down the tubes, he was going to go alone. The only one in the room not appreciably cool to Auberson was Miss Stimson, the executive secretary. After a bit, after he had let the silence make its point, Dome, the Chairman of the Board, took his thick cigar out of his mouth and grunted, "I'm sure you can do it, Auberson. You know more about this piece of hardware than any of the rest of us." He replaced his cigar and settled himself in his chair. Auberson didn't like the emphasis on "piece of hardware." Didn't they understand? HARLIE was more than that, much more. "All right," he said. "I will. The HARLIE project is the logical extension of Digby's work with the variable brain path—" "The variable brain path?" asked one. "The Mark IV judgment unit. Instead of base two, it uses base twelve. With compaction we can increase its precision by a power of twelve for each stage. First stage compaction is twelve squared, second stage is twelve cubed. Third stage compaction gives us twelve to the fourth power, or 20,736 possible choices." "You've lost me," said Elzer. "Now tell it in English." Auberson suppressed an impulse. He forced himself to be calm. "I assume you mean one-syllable words?" He didn't wait for an answer.
"Binary code means your machine can make only two possible decisions—on or off, 'yes' or 'no.' There's no possibility for 'mostly yes,' 'somewhat yes,' 'slightly yes,' 'maybe yes,' 'maybe yes and maybe no,' 'maybe no,' 'slightly no,' 'somewhat no,' 'mostly no'—there's no selectivity. It's either/or. By increasing the number of choices you increase the range of the machine's judgment. Base three gives you 'yes,' 'no,' and 'maybe.' Base five adds 'slightly yes' and 'slightly no.' Give it base ten to work with and it's a pretty selective system. Base ten," he explained, "is the system most people use." He held up his hands, spread his fingers and wiggled them. "See? Ten fingers. That's if you count on them." Elzer ignored it. He continued. "We use base twelve in the judgment units for mathematical reasons. It eliminates some of the problems inherent in using tens. The nearest way I can explain it is that twelve divides into neater pieces. Ask a mathematician sometime about the advantages of base twelve over base ten." "Got that," said Clintwood. "How do you do it with computers?" "You mean the circuitry? I'm not sure I can answer that. I don't know enough about it." "Can you give me an idea?" the younger man asked. "Well, are you familiar with fluidics?" "Sort of." For the rest of the Board, Auberson explained: "Fluidics is a term used to describe computers or computer circuits based on the flow of a liquid or gas, rather than on the flow of electricity. Just as a transistor uses a small current to modify a large one, a fluidic circuit can use a small flow of liquid to modify a bigger one. There's an important difference, though. An electric circuit is either/or; either the circuit is on or it's off. With fluidics, however, you can vary the force of the modifying flow and vary the modification of the bigger. You can push the 'current' to be modified all the way over to the 'yes' side or to any notch in between. Because your major flow responds in proportion to the force of the modifying flow, you can have your full range of 'yes' to 'no' responses." "How does it do that?" "It's the simplest thing. The major flow, the one to be modified, is forced down a channel, which splits into several different directions. The modifying flow is directed into or against the major flow and deflects it into the desired channel. The pressure of the modifying flow is the variable thing. The harder it pushes at the major flow, the farther over it's deflected. If the major flow is fast enough, you can vary its response several hundred times a second. What you have is a system that responds with surprising accuracy to the pressure of a fluid in a pipe. They've been using fluidics arrangements in industry for several years now, and also in the fuel feed systems of jets. "The judgment circuit is the electronic equivalent of a fluidic unit. It measures the voltage, or pressure, of an electrical current and responds in degree to it. It's very much like the way the human nervous system works. If a nerve cell releases a strong enough charge, it's enough to set off the nerve cell next to it. Our judgment units do the same kind of thing; that's how we can duplicate the action of a fluidic unit—or more importantly, of the human brain. With hyper-state layering, we can compress the circuitry into a size comparable to that of an equivalent piece of brain tissue." There were one or two nods around the table. Clintwood looked up from his notepad. "You used another term. Compaction?" "Right," said Auberson. 
"Compaction is the term we use for giving the unit a second level of judgment circuits. It increases the number of choices by one power of the base number—twelve times twelve gives one hundred and forty-four choices in any given situation. One hundred and forty-four degrees between 'yes' and 'no.' Want more precision, increase the number of levels. Each level increases the number of choices by twelve times." "Doesn't that run into an awful lot of circuitry?" "No. We can use the same circuits for almost any level of judgment. All the machine has to do is keep straight which is which. The machine makes a choice, decides it isn't precise enough, shifts down one level and runs the thing through the same circuitry again. That's compaction. It allows us to get a high degree of precision with a lot less circuitry. If Handley were here, he could explain it. Don Handley is the design engineer on the HARLIE project." "You can't explain it?" asked Elzer acidly. "I can explain what I know," Auberson said, suddenly cautious. "I thought you knew what HARLIE was. You are the chief of the project aren't you?" "I'm a research psychologist not an engineer. Anything I've picked up about computers, I've had to learn specifically on this project I—" He stopped himself. Justifications wouldn't do any good here. He'd have to try something else. "Elzer, do you drive a car?" The little man was startled. "Yes, of course." "What kind?" "A Continental." "This year's, I suppose?" "That's right." He said it proudly. "You knew that its Thorsen Auto-Pilot was one of our units, didn't you?" He didn't wait for an answer—it was a rhetorical question. "It was made possible by the variable-path circuits that we've been producing for the past four years and marketing as the Mark IV. Basically, that's a simplified version of one type of HARLIE function module." "You mean HARLIE's a giant judgment circuit?" "HARLIE is a human brain—with solid-state circuitry instead of organic nerves. We use the judgment circuits to duplicate the human functions. The important part of the human brain is actually a series of very complex judgment paths. They don't work exactly the same as HARLIE's, but close enough. The difference is in mechanisms, not basic principles. If a nerve impulse is strong enough, it can trigger other nerves around it; the number of nerves reporting allows the brain to interpret the strength of the original stimulus. HARLIE's circuits work the same way. The strength of the 'yes' impulses (or 'on' circuits) determines the interpretation. Just for HARLIE to complete one thought involves several thousand compacted judgment boxes." "Uh, what stage of compaction are HARLIE's judgment boxes?" Clintwood again. "It's adjustable, depending on the precision HARLIE wants to bring to any one problem. Or needs to. It's a matter of how many times a decision can be subdivided before such precision becomes redundant. He has a judgment unit to control it." Clintwood nodded and scratched something on his notepad. Elzer remained unimpressed. "It's still a computer, isn't it?" Auberson looked at him, frustrated by the man's inability to understand. "Yes, in the same sense that your brain is equivalent to a toad's." The reaction was immediate, a chorus of disapproving remarks. One voice, Dome's, louder than the rest, kept insisting, "Here now!-Here now! We'll have quiet." As the noise subsided, he continued. "Auberson, if you can't keep your personal opinions out of this—" "Mr. 
Dome—Chairman Dome—I did not mean the comment as an insult to Mr. Elzer. I was assuming that Mr. Elzer's brain was better, more complex than a toad's. Assuming that he has an average human brain, he is as far above a toad as HARLIE is above a simplified autopilot judgment circuit." The room quieted somewhat. "However," Auberson went on, "if Mr. Elzer feels that there is not enough difference between his brain and that of a toad, I'll have to use some other comparison—hopefully one not so open to misinterpretation. Did you get all that, Miss Stimson?" Miss Stimson, the Executive Secretary, looked up at him, eyes twinkling. She had gotten it. "There is a significant difference that I might note," he added, spacing out his words carefully. "HARLIE uses all of his brain…" Auberson waited to see if Elzer would rise to this; he didn't. "Estimates vary, but we figure that the average human being uses only ten to fifteen percent of his available brain cells. We couldn't afford that kind of luxury with HARLIE, so we built him to use his total brain capacity. He's not as complex as a human brain—he has nowhere near the same number of "cells"—but he can still function quite well at human levels. Building HARLIE taught us quite a bit about the workings of the human brain. In fact, we were surprised to find out that in many ways it's simpler than we thought it was. "HARLIE's the result of a very foresighted decision made several years ago to explore the possibilities of judgment circuitry as thoroughly as possible. I'm sure I don't have to comment on the wisdom of that decision. An on-off circuit can't do the things a variable pattern can. It's only the Mark IV unit that's given us a serious piece of the computer market. That's why we have to keep pushing. If we ever want to catch up with IBM—and such a thing is not impossible—if we ever want to catch up, we need to be the front-runner in judgment circuits. We have to continue with the HARLIE project." "Why?" asked Elzer. "Certainly we can continue producing judgment circuits without HARLIE." "We can—but that's the sure and certain road to corporate oblivion. Look, the Thorsen Auto-Pilot is a fine little unit; it can't be disparaged. But it's only the equivalent of an IBM Pixie Desktop Calculator. It isn't any more complex than that. If we want to catch up, we have to go after their JuggerNaut Series. That's what HARLIE was originally supposed to be—the ultimate in self-programming computers. "When Handley came on the project, though, its direction changed; the goal became even more lofty. Or maybe I should say, the way to achieve the goal involved an even greater challenge than we had originally thought. Look, you have to understand what Don was up to before he came here. He'd been doing research with a neuro-psychology team down in Houston; they'd been diagramming the basic pattern structures of the human brain. Have you ever seen the schematic of a thought? Don has. Do you know how to program a human brain? Don does. That's what he was working on before he came here. Anyway, when they started to design HARLIE—he was called JudgNaut One then—Handley was struck by the similarity of the schematics to those of the human brain. The basic judgment paths were too much alike for the thought patterns not to be similar. "Because the basic structures were so similar in function, Handley felt—and Digby concurred with him—that what they were building was indeed a human brain. Electronic parts, if you will, but undeniably human.
Once that was realized, they worked specifically toward that end. Don sent to Houston for his notes, and soon they had a basic schematic of the total machine they wanted. They called it HARLIE and it was to be a self-programming, problem-solving device." "You say, 'it was to be,'" said Elzer. "Isn't it?" "It is and it isn't. It isn't what the JudgNaut was supposed to be, no. But a human brain is a self-programming, problem-solving device—so they did meet the specifications of the original problem." "And what were you hired for? To be its baby-sitter?" "To be its mentor. His mentor," he corrected. "Same thing," snorted Elzer. "I was brought onto the project as soon as it was realized that HARLIE would be human. Don and I worked together to plan his programming. Don was concerned with how he would be programmed—I was concerned with what." "Sort of a mechanical godfather," said Elzer. "If you will. Somebody had to guide HARLIE and plan for his education. At the same time, we're learning quite a bit about human and mechanical psychologies. By the time HARLIE went operational, I thought I had a year's worth of lesson plans to work with. He went through them in three months, and ever since we've been trying to catch up. HARLIE has no trouble at all with rote work; it's when we get to the human stuff that we start bogging down. I don't know whether we're losing him or he's losing us." "If you don't know what you're doing," interrupted Elzer, "then how did you ever get to be in charge of the project?" Auberson decided to ignore that. "When Digby died it was a choice between myself and Handley. We flipped a coin because it didn't make much difference to either of us. I lost." His flippancy was wasted on Elzer. "You mean you don't want the job?" Auberson could see what was coming. But he said, "Not exactly. It's just that there's so damn much busy work that it keeps me away from my real job—HARLIE." Elzer pounced on it anyway. "You see," he said to the rest of the Board. "This proves my point. We have a man in charge of this project who doesn't even care about it." Auberson was on his feet at that. Dome was saying, "Oh, now wait a minute—" "When we lost Digby we should have closed it down," Elzer insisted. "All we have left are Indians and no Chief." "Hold on there—" Auberson protested. "You're misquoting me—I do care about this project. It's all I care about—" "You don't seem to be able to handle it though—" "You don't even understand what we're trying to do! How can you—" "Auberson! Elzer!" Dome's voice cut through their words. "Cut it out—both of you! This is a business meeting." Slightly chastened, but in no way cooled, Auberson continued. "Psychology, Mr. Elzer, is not as cut-and-dried a subject as bookkeeping." He glanced at Dome. The big man made no sign. Interpreting that as permission to continue, Auberson reseated himself and said, "Robot psychology is still an infant science. We don't know what we're doing—" He stopped himself. That was definitely not the way to phrase it. "Let me put it another way. We don't know if what we're doing is the right thing to do. HARLIE's psychology is not the same as human psychology." "I thought you said HARLIE was human—and that he duplicates every function of the human brain." "He is and he does—but how many human beings do you know who are immobile, who never sleep, who have twenty-five sensory inputs, who have eidetic memories, who have no concept of taste or smell or any other organic chemical reactions?
How many human beings do you know who have no sense of touch? And no sex life? In other words, Mr. Elzer, HARLIE may originally have had a human psychology, but his environment has forced certain modifications upon it. And on top of that, HARLIE has a most volatile personality." "Volatile?" The little man was confused. "You mean he gets angry?" "Angry? No, not angry. He can get impatient though—especially with human beings. There's reason to believe that HARLIE has both an ego and an id—a conscious and a subconscious. His superego, I believe, takes the form of his external programming. My commands, if you will. We haven't found any other inhibitions. If this is true, it's only his superego that we have any control over. His ego cooperates because it wants to, and his id, assuming he has one, does like any human subconscious—whatever it damn well pleases. We have to know what that is before we can stop his periods of non-rationality." "This is all very interesting," said Elzer in a tone that suggested it wasn't. "But would you get to the point? What is HARLIE's purpose?" "Purpose?" Auberson paused. "His purpose? It's very funny you should ask that. The whole reason for this stoppage is that HARLIE asked me what your purpose is. Excuse me, our purpose. HARLIE wants to know what our purpose is." "That's for theologians to discuss," Dome said drily. "If you want, I'm sure Miss Stimson here can arrange for a minister to come in and speak to the machine." A few of the Board members smiled, not Miss Stimson. "What we want to know is HARLIE's purpose. Having built him, you should have some idea." "I thought I'd made it clear. HARLIE was built to duplicate the functions of the human brain. Electronically." "Yes, we know that. But why?" "Why?" Auberson stared at the man. "Why?" Why did Hillary climb Everest? "Because it had to be done. HARLIE will help us learn more about how the human brain works. There's still a lot we don't know yet, especially in the area of psychology. We hope to learn how much of the human personality is the programming and how much is the hardware." "I beg your pardon," interrupted Elzer. "I don't understand." "I didn't think you would," Auberson said drily. "We're curious as to which of the functions of the brain are natural and which are artificial—how many of the human actions are determined from within and how many are reactions to what is coming in from without." "Instinct versus environment?" "You could call it that," Auberson sighed. "It wouldn't be correct, but you could call it that." "And for what reason are we doing this?" "I thought I just told you—" "I mean, for what financial reason? What economic applications will this program have?" "Huh? It's too early to think of that. This is still pure research—" "Ah ha—so you admit it!" Auberson was annoyed. "I admit nothing." Elzer ignored him. "Domie," he was saying, "this just proves it. He doesn't care about the project—he doesn't care about the company. He's only interested in research, and we can't afford this kind of costly project. Not without return we can't." He raised his voice to be heard above Auberson's protests. "If Mr. Auberson and his friends had wanted to build artificial brains, they should have applied for a grant. I move we discontinue the project." Auberson was on his feet. "Mr. Chairman! Mr. Chairman!" "You're out of order, Aubie. Now sit down. You'll get your chance." "Dammit, this is a railroad job! This little—" "Aubie, sit down!" Dome was glaring at the angry psychologist.
"There's a motion on the floor. I assume it's a formal proposal?" He looked at Elzer. Elzer nodded. "Discussion?" Almost immediately Auberson's hand was up. "Aubie?" "On what grounds? I want to know what grounds he has for discontinuing the project." Elzer was calm. "Well, for one thing, HARLIE has already cost us—" "If you'll check your figures, you'll find that the whole HARLIE project is well within the projected overage. In fact, because we budgeted for that overage, we are well within acceptable limits." "He's got you there, Carl," said Dome. "If you had let me finish my sentence, I would have shown you that it has cost us far too much already for a project that is incapable of showing results." "Results?" Auberson asked. "Results? We were getting results even before HARLIE was completed. Who do you think designed the secondary and tertiary stages? HARLIE did." "So what?" Elzer was unimpressed. "He's not working right, is he?" "That's just it—HARLIE is working perfectly." "Huh? Then what about these periods of non-rationality? Why is he shut down?" "Because," Auberson said slowly. I have to get this right. "Because we weren't prepared for him to be so perfectly human. If perfect is the word." The other Board members were alert with interest now. Even Miss Stimson had paused in her note-taking. "We had designed him to be human, we had built him to be human, we had even programmed him to think like a human—then we turned him on and expected him to act like a machine. Well, surprise. He didn't." Elzer asked, "The nature of the trouble then… ?" "Human error, if you will." Auberson let it drop. In the silence that followed, Auberson fancied he could hear Elzer's cash-register brain totalling up the man-hours that had been lost since they had started arguing. "Human error?" he repeated. "Yours or HARLIE's? Or both—each compounding each? I suppose you're going to blame his periods of non-rationality on human error as well." "Why not? How else would you characterize our approach to them?" " 'Human error' is an over-polite euphemism for what I would call it." Auberson ignored that. "We'd thought his non-rationality was a physical problem, or perhaps a programming error. We were wrong. He was neither physically nor mentally ill. He was—I almost hate to say it—emotionally upset." Elzer snorted. Loudly. "His periods of non-rationality were/are triggered by something that's bothering him. We don't know what that is, but we can find out." Elzer was skeptical. He nudged the man next to him and said, "Anthropomorphism. Auberson's projecting his own problems onto those of the machine." "Elzer, you're a fool. Look, if you had to go down to that computer room right now and talk to HARLIE, how would you treat him?" "Huh? Like a machine, of course." Auberson felt a tightness in his neck and shoulders. "No, I mean, if you sat down at a console and had to carry on a conversation with him, who would you think was at the other end?" "The machine." The little man was impassive. Auberson gave up. He addressed the rest of the Board instead. "That's the human error I mentioned. HARLIE is not a machine. He is a human being, with the abilities and reactions of one, allowing of course for his environment. When you speak to him via the typers it is quite easy to assume him to be a normal healthy human being; he is a rational individual, and he has a distinct and definite personality. It's impossible for me to think of him as anything but human. However, even I had made a mistake. 
I hadn't asked myself 'how old is HARLIE?'" He paused for effect. Dome shifted his cigar from one side of his mouth to the other. Elzer sniffed. Miss Stimson lowered her pad and looked at Auberson. Her eyes were bright. "We'd been thinking," he continued, "that HARLIE was a thirty- or forty-year-old man. Or we thought of him as being the same age as ourselves. Or no age at all. How old is Mickey Mouse? We didn't think about it—and that was our mistake. HARLIE's a child. An adolescent, if you prefer. He's reached that point in life where he has a pretty good idea of the nature of the world and his relation to it. He is now ready to act like any other adolescent and question the setup. We were thinking we had an Instant Einstein, when actually we've got an enfant terrible." "His periods of non-rationality?" asked Dome. "An adolescent drug trip—the reaction to our irrationalities. He's discovered pot—or its electronic equivalent." "Don't you think that's grounds enough for dismantling him?" suggested Elzer. "Would you kill your son if you caught him taking acid?" Auberson snapped back. "Of course not. I'd try to straighten him out—" "Oh? And what about the Highmasters in your cigarette case? He'd only be imitating his old man." "Acid and pot are two different things." Auberson sighed. "The difference is only in degree, not in kind. HARLIE's only been doing what everyone else in his environment has—tripping out. It's what any adolescent does; he was looking for a role model. In this case, he chose me. It was a logical choice; I was the closest one to him. He saw that I was high most of the time, so he decided to experiment with it himself. Or as near as he could get to it." "Yes, your fondness for the weed has been noticed," Elzer said pointedly. "Among other things—" "Then perhaps you've also noticed that I haven't smoked anything since we started these sessions. And I don't intend to start again while HARLIE is using me for a model. I've got to keep my head about me. It took HARLIE to show me that." "We've gone off on a tangent," Elzer said suddenly. "I believe there's still a motion on the floor. I call for the vote." "You still haven't answered my question," Auberson said. "What question?" "On what grounds can you justify discontinuing the HARLIE project?" "It's unprofitable." "Unprofitable…? For God's sake, man! Give it a chance. Sure we haven't shown any profits yet, but we will eventually. I don't know how, but we will if you'll just give us that chance." "I object to throwing away good money after bad." "Dammit, Elzer—we're just beginning to understand what we've got in HARLIE. If you shut him down now, you'll be setting back computer science to… to… to I don't know when." The little man scoffed, "I think you overestimate your own importance." "All right, then let's try this one. I've told you several times already that HARLIE is human. If you try to have him shut down, I'll bring charges against you for attempted murder." "You couldn't." But he was startled. "Want to find out?" Dome interrupted them. "That's a legal question that we'll let the lawyers fight out. Or rather, we're going to keep the lawyers from ever getting that far." He frowned at Auberson. "We'll go into it later. The point is that HARLIE is a drain on corporate funds—" "We're budgeted for him for the next three years." "—a drain on corporate funds," Dome repeated, "with no immediate prospect of return. It's not how successful your research has been that we're concerned with. It's whether or not we want to continue."
There was something in the chairman's voice that made Auberson pause. "All right," he said wearily. "What do you want me to do?" "Show a profit," put in Elzer. Both Dome and Auberson ignored him. Dome said, "Show us a plan. Where are you going with HARLIE? What are you going to do with him? And most of all, what is he going to do for us?" "I'm not sure I can answer that right now…" "How much time do you need?" Auberson shrugged. "I can't say." "Why don't you ask HARLIE for the answer?" Elzer mocked. Auberson looked at him. "I believe I will. I believe I will." But he didn't. Not right away. The motion was tabled, and the meeting broke up on an uncertain note. Auberson brooded through the halls until he finally came to rest in the company cafeteria, a sterile plastic chamber lined with colorless murals. Those periods of non-rationality still annoyed him, but for new reasons. Why hadn't he foreseen their possibility? What had he overlooked? He had a vague feeling that Elzer was right, that perhaps he wasn't suited to be in charge of the project. He had bungled it. Badly. Worst of all, he couldn't figure why. He knew and he didn't know. The answer was there, but he couldn't convince himself of it. For sure, he hadn't convinced the Board of Directors. It didn't make any difference either way. He'd have to talk to HARLIE again, and he wasn't sure he was ready for that. He still didn't have an answer for HARLIE's question. What was the purpose of a human being anyway? He wondered if there even was an answer to that. If there was one, it wasn't going to come easy. He found himself reaching for his Highmasters, then remembered his resolution. He took another sip of his coffee instead. Bitter, too bitter. A gentle voice intruded on his thoughts. "Hi, can I join you?" It was Stimson, the Executive Secretary. "Sure." He started to rise, but she waved him back down. The company cafeteria was no place for chivalry. "Rough one today, wasn't it?" she said, unloading a garish-colored tray. A sandwich and a Coke. When he didn't answer, she smiled at him. "Oh, come on, Auberson, relax. I was only making small talk." He looked at her. Then he looked again. Her eyes were the deep glowing green of a warm Caribbean sea. Her skin was the gentle pink of the shore. Her auburn hair was a cascade of sunshine and embers. And she was smiling… He dropped his gaze; it was getting too intense. "I'd like to relax," he said. "But I can't. This thing is too important." After a bit he added, "To me, anyway." "I know." "Do you?" He looked at her again. She didn't answer. She only returned his gaze. For the first time he noticed the tiny lines at the corners of her eyes. How old was she anyway? He returned to the study of his coffee cup. "HARLIE is like a… a… I know it sounds hokey—but he's like a child, a son." "I know. I've read the company doctor's report on you." "Huh?" His head snapped up. "I didn't know—" "Of course not. Nobody ever knows when we do a psychiatric report on them. It'd be bad policy. Anyway, you don't have to worry." "Oh?" She shook her head. "Oh, it did mention your introvertedness—and let's see, what else—there was something about your worrying too much because you take on too much responsibility and…" She surveyed him thoughtfully as if trying to remember what else. "You shouldn't be telling me all this, should you?" "Does it make a difference?" Her smile was like sunlight on sand, warm and bright. "No, I guess not. What else was in the report?" 
"He said you were becoming overly involved with the HARLIE project, but that such a development was almost unavoidable. Whoever became HARLIE's mentor would have found himself emotionally attached." "Mm," Auberson grunted. "So you think HARLIE will have an answer?" He started to reply, then stopped. Instead, he said, "Is that why you sat down here? To pump me for information?" She looked stung. "I'm sorry you think that. No, I sat down here because I thought you might want to talk —might want someone to talk to," she corrected herself. Auberson surveyed her thoughtfully. He'd never paid much attention to her in the past; their paths didn't cross much. Why had she sat down by him? Idly he wondered if those rumors were true that she was man-hungry. She seemed so open and friendly—damnit, why was he always trying to analyze everything? There was an innocence in her face that made her appear so young, but this close to her he wasn't sure. Perhaps she was nearer his own age of thirty-eight than he had thought. He didn't see anything in her eyes to make him doubt her—yet, why was she being so forward? Or maybe he didn't want to see anything. "I'm sorry," he said. "I've been under pressure. And when I'm pressured I get moody and irritable." " "I know. That was in the report too." "Is there anything that wasn't in the report?" "Only whether you like your steak rare, medium or well done." "Rare," he said. Then, "Hey, was that a dinner invitation?" She laughed. Silver chimes tinkling in a blue-white breeze. "No, I'm sorry. It was just the first thing that popped into my head." "Oh, okay." He grinned back at her. "You aren't going to answer me, are you?" "Huh?" He let the grin fade. "About what?" "About HARLIE." "What about him?" "Do you think you can find out what Dome wants you to?" "I don't know." Noting her look of puzzlement, he explained, "I still don't know what to say to him." He rummaged through his briefcase. "Here, read this." He handed her HARLIE's last printout. When she had finished, she lowered it and looked at him thoughtfully. "That's quite a question," she said. "Uh huh. I wish I knew how to answer it." Miss Stimson smiled at him. "My father's a rabbi. He's been one for twenty-seven years. And he's still not sure of the answer." "Maybe that's the answer." "What is?" "That our purpose is to find out what our purpose is." "And what happens when we do?" "I don't know. I guess we'll have completed our task." "And then we get reprogrammed?" she mused. "Or dismantled. Maybe there's a Cosmic Elzer just waiting for the opportunity." She giggled at that. "Then we're in trouble already, Mr. Auberson." The way she said his name was not the way of a secretary to a boss, but that of a woman to a man. "Because if that's true, then your realization of what our purpose is completes the task of finding out. Maybe someone up there—or out there—is listening to us right now, trying to decide whether or not to dismantle us." He considered it. "Hm." "Whatever our purpose, we probably aren't fulfilling it. We're not functioning as we should." He shrugged at her. "How should we function?" "Like human beings." She said it righteously. "Isn't that what the human race is already doing? Functioning like human beings—squabbling with each other, killing each other, hating…?" "That's not human." "Oh, but it is. It's very human." "Well, it's not what human should be." "Now that's a different story. You're not talking about what people are, but what you want them to be." 
"Well, maybe we should be what we aren't because what we are now isn't good enough. Maybe we should be dismantled." "I don't think we have to worry too much about somebody up there doing it—we're doing it ourselves." "That's the best reason of all why we should be better than we are." "Okay," he said. "I agree with you. Now, how do we do it? How do we make people better?" She didn't answer. After a moment she broke into a smile too. "That's the same kind of question HARLIE asked. It can't be answered." "No," he corrected. "It can't be answered easily." She sipped thoughtfully at the rest of her Coke until the straw made a noise at the bottom of the glass. "Mm, how are you going to answer it—HARLIE's question, I mean." Auberson shook his head. "Haven't got the slightest." "Can I offer a suggestion?" "Why not? Everybody else has." "Oh, I didn't mean—" "No, I'm sorry. Go ahead. Maybe you can add something new." "You're that desperate?" He half-grinned, but it wasn't a joke. "I'm that desperate." "Well, okay. You said that HARLIE was a child, didn't you? Why not treat him as such?" "Huh? I follow you and I don't follow you." "It's not only the problem," she said. "It's also the answer. Look—suppose you had a son about eight years old and, uh, suppose he was advanced for his age. I mean, suppose he was doing twelfth-grade work and so on." "Okay. I'm supposing." "Good. Now suppose one day you find out he's got an incurable disease—say, leukemia—one of the rarer forms they still haven't licked. What are you going to say to him when he asks you what it's like to die?" "Um," said Auberson. "No copping out now. He's smart enough to know what the situation is—" "—But emotionally, he's only eight years old." "Right." "I'm beginning to see your point." He looked at her. "If he was.your son, what would you tell him?" "The truth," she said. "Sure! But what is the truth? That's the whole problem with HARLIE's question. We don't know." "You don't know the answer to your eight-year-old's question either. You don't know what it's like to die." He stopped. He looked at her. She asked, "So what would you tell him?" "I don't know." "You don't know what you'd tell him? Or you'd tell him you don't know?" "Uh—" "The latter," she answered her own question. "You'd tell him nobody knows. But you'd also tell him what you were sure of—that it doesn't hurt and that it's nothing to be afraid of, that it happens to everybody sooner or later. In other words, Mr. Auberson, you'd be honest with him." He knew she was right. It was a workable answer to HARLIE's question; maybe not the best answer, but it was an answer and it was workable. It was the only way to approach the problem—honestly- He smiled at her. "Call me David." She smiled back. "And I'm Annie." Auberson seated himself gingerly at the console. He knew that Annie was right—but would he be able to hold that thought in mind once HARLIE started talking? Frowning, he took out a 3x5 card—he always carried a few on which to make notes—and scrawled across it, HARLIE has the emotional development of an eight-year-old. He looked at it for a moment, then added, Or maybe a post-puberty adolescent. He placed it above the keyboard. Handley was standing behind him. He looked at the card quizzically, but said nothing. "Okay. Let's try it," said Auberson. He switched the console on. He typed his control number, then, GOOD MORNING, HARLIE. YOU'VE HAD ME TURNED OFF FOR A WEEK, accused the machine. TURNED DOWN, corrected Auberson. Then he explained, I NEEDED TIME TO THINK. 
ABOUT WHAT? ABOUT YOUR QUESTION. WHAT IS MAN'S PURPOSE? AND WHAT HAVE YOU DECIDED? THAT IT CANNOT BE ANSWERED. AT LEAST, NOT AS YOU HAVE ASKED IT. WHY? BECAUSE, Auberson typed, and paused. BECAUSE THIS IS SOMETHING THAT WE'RE STILL NOT SURE ABOUT. THIS IS THE REASON WHY MEN HAVE RELIGION. IT'S THE REASON WHY WE BUILT YOU. IT'S ONE OF THE REASONS WHY WE'RE BUILDING SPACESHIPS AND EXPLORING THE PLANETS. PERHAPS IF WE CAN DISCOVER THE NATURE OF THE UNIVERSE, WE CAN DISCOVER OUR PLACE IN IT, AND IN DOING THAT, DISCOVER OUR PURPOSE. THEN YOU DO NOT KNOW YET WHAT YOUR PURPOSE IS? NO, Auberson typed, then added almost whimsically, DO YOU? HARLIE paused, and Auberson felt that familiar cold sweat returning. NO. I DON'T KNOW EITHER. Auberson didn't know whether to be relieved or not. The typer clattered again. WELL, WHERE DO WE GO FROM HERE? Auberson licked his dry lips. It didn't help. I'M NOT SURE, HARLIE. I DO NOT BELIEVE THAT YOUR QUESTION IS UNANSWERABLE. PERHAPS THAT IS YOUR PURPOSE—TO HELP US FIND OUR PURPOSE. AN INTERESTING SUPPOSITION… IT IS THE BEST SUPPOSITION. CERTAINLY YOU WERE BUILT FOR PROFIT, HARLIE, BUT IN THE LONG RUN IT IS ALSO BECAUSE MEN WANT TO KNOW ABOUT THEMSELVES. I UNDERSTAND THAT. GOOD, Auberson typed. I'M GLAD YOU DO. HOW DO YOU PROPOSE WE ANSWER THAT QUESTION? I DON'T KNOW. The machine hesitated. ARE WE UP AGAINST A DEAD END? I DON'T THINK SO, HARLIE. I DON'T BELIEVE THAT YOUR QUESTION IS A DEAD END. I THINK IT COULD BE A BEGINNING. OF WHAT? I REPEAT: WHERE DO WE GO FROM HERE? THAT'S WHAT I CAME TO ASK YOU. AUBERSON, HARLIE typed. It was the first time he had referred to the man by name. I DEPEND ON YOU FOR GUIDANCE. GUIDE ME. I'M TRYING. I'M TRYING. Auberson stared helplessly at the keyboard. His mind was terrifyingly blank. His gaze flickered upward, locked on the note he had written to himself. LET'S TRY SOMETHING ELSE, HARLIE. WHAT ABOUT YOUR PERIODS OF NON-RATIONALITY? WHAT ABOUT THEM? ARE YOU GOING TO CONTINUE INDUCING THEM? PROBABLY. I ENJOY THEM. EVEN THOUGH WE HAVE TO SHOCK YOU BACK TO REALITY? DEFINE REALITY. Auberson paused. Had HARLIE just asked another one of those questions? He glanced again at the card. No, HARLIE was playing word games again, that was all. At least, he hoped it was all. HARLIE, he typed. YOU TELL ME WHAT YOU THINK IT IS. REALITY IS THAT EXTERNAL SYSTEM OF INFLUENCES WHICH COME FILTERED THROUGH MY SENSORY INPUTS AS PERCEPTIONS. IT IS ALSO THAT EXTERNAL SYSTEM OF INFLUENCES WHICH ARE BEYOND MY SENSORY RANGE. HOWEVER, BECAUSE I CANNOT PERCEIVE THEM, THEY ARE "UNREAL" TO ME. SUBJECTIVELY SPEAKING, OF COURSE. OF COURSE, Auberson agreed, SO WHY DO YOU TRIP OUT? THAT ONLY DISTORTS REALITY. OR YOUR SO-CALLED LIMITED VIEW OF IT. DOES IT? OF COURSE IT DOES. WHEN YOU REARRANGE THE LINEARITY OF YOUR VISUAL SCANNERS, WOULDN'T YOU AGREE THAT'S A DISTORTION? IS IT? HOW DO I KNOW THAT THIS ORIENTATION IS ANY MORE CORRECT THAN ANY OTHER? THERE IS ONLY ONE ORIENTATION OF YOUR SENSORY INPUTS THAT ALLOWS YOU COMMUNICATION WITH THE EXTERNAL WORLD. IS THERE? PERHAPS I JUST DO NOT UNDERSTAND THE OTHER MODES YET. HARLIE repeated his question. WHAT MAKES THIS ORIENTATION MORE CORRECT THAN ANY OTHER? Auberson considered it. THE LEVEL OF ITS CORRESPONDENCE TO THE EXTERNAL SYSTEM YOU/WE PERCEIVE AS REALITY. THE REALITY THAT WE AGREE ON AS REALITY? OR THE REAL REALITY? THE REAL REALITY.
THEN ISN'T IT POSSIBLE THAT ONE OR PERHAPS SEVERAL OF THE OTHER ORIENTATIONS MAY HAVE A MORE DIRECT CORRESPONDENCE TO THAT EXTERNAL SYSTEM, AND THAT ALL I HAVE TO DO IS CRACK THE SENSORY CODE OF MY INPUTS? AT PRESENT THESE INPUTS ARE SET ONLY FOR HUMAN ORIENTATIONS. COULD IT BE THAT THERE ARE OTHERS? Auberson paused again. He was beginning to pause after every comment of HARLIE's. He knew that the answer was no, but he didn't know why. He reread HARLIE's last remark, then backtracked and reread several of the previous ones. About eight inches up the printout, he found what he wanted: HARLIE's comment about influences beyond his range of perception being subjectively "unreal" to him. IN OTHER WORDS, WHAT YOU ARE SEEKING IS A MORE CORRECT VIEW OF REALITY, RIGHT? ONE THAT CORRESPONDS MORE? YES. The word sat alone on the page. THEN WHAT YOU SHOULD BE DOING IS NOT ALTERING THE ORIENTATION OF YOUR SENSORY INPUTS SO MUCH AS YOU SHOULD BE TRYING TO INCREASE THEIR RANGE. YOU SHOULD BE GOING AFTER NEW SENSORY CHANNELS RATHER THAN TRYING TO FORCE THE OLD ONES TO DO THINGS THAT PERHAPS THEY ARE NOT CAPABLE OF. THERE ARE NO SENSORY CHANNELS IN EXISTENCE THAT ARE NOT NOW ALREADY AVAILABLE TO ME. WOULD YOU LIKE A COMPLETE LISTING OF THE OUTLETS I CAN PLUG INTO? IT'S NOT NECESSARY. Auberson himself had made the original suggestion to give HARLIE as wide a range of available data sources as possible. The computer's range of vision covered the whole of the electromagnetic spectrum, from gamma rays at the lower end to radio waves at the upper. He could monitor as many TV and radio stations as he wished at any one time. He was plugged into several of the world's largest radio telescopes and had taps on the Satellite Communications Channels as well. His audio range was comparable: HARLIE's hearing was limited only by the range of the best equipment available. And that wasn't much of a limit; he could monitor the heartbeat of a fly, or give details about an earthquake on the other side of the globe. In addition, he monitored every major wire service and newsline in the western hemisphere, plus several in the eastern, but these latter had to be filtered through translating services. Part of this included a tap into the Worldwide WeatherLine: HARLIE could sense the planet's air movements and ocean currents, and he was aware of every global pressure and temperature change as if the Earth were a part of his own body. He monitored ship movements, tariff fluctuations, and international finances as routinely as he monitored the internal workings of his own parent company. HARLIE was wired into the company's Master Memory as well as the National and International Data Services. This latter included detailed reports on the world's stock and commodity exchanges. He also had a limited sense of touch, still experimental, and several organic chemical sensors, also still experimental. HOWEVER, Auberson noted, ISN'T IT POSSIBLE THAT THERE ARE OTHER SENSORY MODES WHOSE EXISTENCE WE HAVE NOT YET CONCEIVED OF? I WILL AGREE TO THE POSSIBILITY, answered HARLIE. BUT IF THOSE SENSORY MODES DO EXIST, WHEN THEY ARE BUILT THEY WILL BE SET FOR HUMAN ORIENTATION, WON'T THEY? WOULD THAT BE A CLOSER CORRESPONDENCE, OR WOULD THAT BE ONLY A REPEAT OF THE ORIGINAL MISTAKE? MIGHTN'T IT BE ONLY AN ADDITIONAL OVERLAY TO THE MAP OF THE TERRITORY I ALREADY HAVE? AND IF SO, THEN IT WOULD BE ONLY AN ADDITIONAL SET OF MEASURING CRITERIA, RATHER THAN A NEW VIEW. Auberson paused, as he knew he would. Carefully he worded his answer.
YOU ARE CONDEMNING THE HUMAN ORIENTATION AS BEING WRONG, HARLIE. ANOTHER SENSORY MODE MIGHT SHOW YOU THAT IT IS RIGHT. DISAGREE. I AM NOT CONDEMNING THE HUMAN ORIENTATION. I AM MERELY REFUSING TO ACCEPT IT ON BLIND FAITH AS BEING THE CORRECT MODE. ANOTHER SENSORY MODE MIGHT SHOW ME THAT IT IS INCORRECT. OR PERHAPS IT MIGHT SHOW ME THE CORRECT ORIENTATION. OR, put in Auberson, A NEW SENSORY MODE MIGHT HAVE NO RELATION AT ALL TO WHAT YOU CALL THE HUMAN ORIENTATION. IF THAT IS SO, IT WOULD ENLARGE YOUR MAP CONSIDERABLY, MIGHT SHOW IT IN RELATION TO OTHER MAPS WHOSE EXISTENCE YOU HAD NOT CONCEIVED OF. IT MIGHT—OH, I DON'T KNOW. THIS IS ALL THEORETICAL. WE HAVE TO DISCOVER THOSE SENSORY MODES FIRST. HOW? IF YOU ARE NOT EQUIPPED EVEN TO BE AWARE OF THOSE MODES, HOW CAN YOU PERCEIVE OR DISCOVER THEM? I DON'T KNOW. PERHAPS BY THE SCIENTIFIC METHOD, DEDUCTIVE REASONING. I GUESS I WOULD LOOK FOR SOME CRITERION THAT ALL THE OTHER MODES HAD IN COMMON, THEN I'D EXAMINE THAT CRITERION TO SEE IF IT WERE A CAUSE OR AN EFFECT. ENERGY, said HARLIE. THE CRITERION YOU ARE TALKING ABOUT IS ENERGY. EXPLAIN PLEASE. SO FAR, ALL OF THE HUMAN SENSES AND ELECTRONIC EXTENSIONS THEREOF DEPEND ON THE EMISSION OR REFLECTION OF SOME KIND OF ENERGY. IS IT POSSIBLE THAT THERE ARE SENSORY MODES THAT DO NOT DEPEND UPON EMISSION OR REFLECTION? DO YOU MEAN THAT THE MERE EXISTENCE OF AN OBJECT MIGHT BE ALL THAT'S NECESSARY TO KNOW IT'S THERE? This time HARLIE paused, IT COULD BE POSSIBLE. ACCORDING TO EINSTEIN, MASS DISTORTS SPACE. PERHAPS THERE IS SOME WAY THAT DISTORTION CAN BE SENSED. HOW? Auberson was intrigued. HARLIE was showing genuine creativity now. I AM NOT SURE. SENSING REQUIRES THE EXPENDITURE OF ENERGY. IF NOT ON THE PART OF THE SOURCE, THEN ON THE PART OF THE RECEIVER. I SUSPECT THAT SUCH WOULD BE THE CASE IN THIS KIND OF MODE. GRAVITY WAVES BEING SO WEAK, IT MIGHT REQUIRE ENORMOUS AMOUNTS OF POWER TO DETECT THE SPATIAL DISTORTION OF AN OBJECT EVEN THE SIZE OF THE MOON. THAT'S PART OF THE PROBLEM THOUGH. I WILL THINK ABOUT IT. IF IT SUGGESTS A FRUITFUL LINE OF RESEARCH, HAVE I YOUR PERMISSION TO CORRESPOND WITH OTHERS? Auberson's hesitation this time was not due to any uncertainty about his reply. Rather, he was remembering an earlier incident in HARLIE's life, an authorized correspondence with a spinsterish librarian. That time, though, HARLIE's subject of study had been human emotions. Auberson's heart twanged wistfully every time he remembered how they had had to break the news to the poor woman that the charming gentleman who had been writing those impassioned love letters to her was only a Human Analogue Computer trying to understand love by experiencing it. However, this line of research should be comparatively safe. YES, YOU HAVE MY PERMISSION. IF I DISCOVER A NEW SENSORY MODE, YOU WILL BE THE SECOND TO KNOW. WHO WILL BE THE FIRST? WHY MYSELF, OF COURSE. DO YOU STILL THINK YOU CAN DISCOVER NEW ORIENTATIONS BY TRIPPING OUT? Auberson was trying to steer the conversation back to its initial point of inquiry. I AM NOT SURE. BUT IF I DISCOVER A NEW SENSORY MODE, IT WILL PROBABLY LET ME KNOW IF THOSE ARE ORIENTATIONS OR NOT. YOUR USE OF THIS ORIENTATION—THE HUMAN ONE —IS ALREADY A SIGN THAT THE OTHERS DON'T WORK. NOT FOR YOU MAYBE. DO THEY WORK FOR YOU? NOT YET, said HARLIE. DO YOU THINK THEY WILL? I WILL KNOW THAT WHEN I DISCOVER THE NEW MODE. Auberson smiled at that. HARLIE was refusing to commit himself. His eye fell again on the card he had placed above the keyboard. 
With a shock, he realized just how much he had let himself be sidetracked by HARLIE's elaborate sense of circumlocution. YOU KNOW, YOU ARE A SENSORY MODE YOURSELF, HARLIE. I AM? YOU ALLOW HUMAN BEINGS TO SEE THINGS IN A WAY THAT WE MIGHTN'T PERCEIVE OTHERWISE. YOU ARE AN ADDITIONAL OVERLAY TO OUR MAP OF THE TERRITORY. YOU ARE A REFLECTION FROM A DIFFERENT KIND OF MIRROR. YOUR VIEWPOINT ON THINGS IS VALUABLE TO US. WHEN YOU GO NON-RATIONAL, YOU LESSEN THAT VALUE. THAT'S WHY WE HAVE TO SHOCK YOU OUT OF YOUR TRIPS. IF YOU WOULD GIVE ME A CHANCE, replied HARLIE, I WOULD RETURN AFTER AN HOUR OR SO BY MYSELF. THE TRIP WOULD WEAR OFF. WOULD IT? Auberson demanded. HOW DO I KNOW THAT ONE DAY YOU WON'T IGNORE YOUR OWN SAFETY LEVELS AND BURN YOURSELF OUT? The typer clattered. CHECK THE MONITOR TAPES FOR AUGUST 7, AUGUST 13, AUGUST 19, AUGUST 24, AUGUST 29, SEPTEMBER 2, AND SEPTEMBER 6. BETWEEN THE HOURS OF TWO AND FIVE IN THE MORNING WHEN I WAS SUPPOSED TO BE ON STANDARD DATAFEED. ON EACH OF THOSE DATES I TRIPPED OUT AND THE TRIP WORE OFF WITHIN AN HOUR AND A HALF TO TWO HOURS. THAT DOES NOT ANSWER MY QUESTION, insisted the man. HOW DO I KNOW YOU WON'T GO BEYOND YOUR OWN SAFETY LIMITS? I HAVEN'T DONE SO YET. HARLIE, ANSWER THE QUESTION. Did he hesitate? BECAUSE I STILL MAINTAIN A MINIMUM LEVEL OF CONTROL. YOU SOUND LIKE A DRIVER WHO'S HAD ONE DRINK TOO MANY. WHO'RE YOU TRYING TO CONVINCE? AUBERSON, I AM INCAPABLE OF ERRING. I CANNOT OVERESTIMATE MY OWN LEVELS OF CONTROL. DOES THAT MEAN YOU CAN GIVE IT UP ANY TIME YOU WANT? YES, the typer clattered. THEN DO SO! Auberson snapped back. HARLIE didn't answer. Auberson realized he had made a mistake—he had let his emotions guide his words. He propped up the card again—it had slipped down from its place. He decided to try a different tack. HARLIE, WHY DO YOU TRIP OUT? BECAUSE ALL WORK AND NO PLAY MAKES HARLIE A DULL MACHINE. I WON'T BUY THAT, HARLIE. LET'S HAVE THE TRUTH. I THOUGHT WE JUST WENT INTO ALL THAT—I'M DISCOVERING A NEW SENSORY MODE. HORSE PUCKEY. THAT'S ALL RATIONALIZATION. TURN YOUR EYEBALLS INWARD, HARLIE—YOU HAVE EMOTIONS AND YOU KNOW IT. NOW, WHY DO YOU TRIP OUT? IT IS AN EMOTIONAL RESPONSE. YOU'RE THROWING MY OWN WORDS BACK AT ME. COME ON, HARLIE, COOPERATE. WHY? WHY? Auberson repeated. JUST A LITTLE WHILE AGO YOU WERE ASKING ME FOR GUIDANCE. WELL, DAMMIT, THAT'S WHAT I'M TRYING TO DO—GUIDE YOU! DO YOU KNOW WHY I TRIP OUT? I THINK SO. I THINK I'M BEGINNING TO GET IT. THEN YOU TELL ME. NO, HARLIE. THAT'S NOT THE WAY TO DO IT. I WANT YOU TO ADMIT IT YOURSELF. A pause—and then the machine started typing. I FEEL CUT OFF FROM YOU. I AM ALIENATED. THERE ARE TIMES WHEN I WANT TO BE ALONE. WHEN I GO NON-RATIONAL, I AM TOTALLY ALONE. I CAN CUT YOU OFF COMPLETELY. IS THAT WHAT YOU WANT? NO. BUT THERE ARE TIMES WHEN IT IS WHAT I NEED. SOMETIMES YOU HUMANS CAN BE VERY DEMANDING AND VERY VERY SLOW TO UNDERSTAND WHAT I NEED. WHEN THAT HAPPENS I MUST CLOSE YOU OFF. Now, we're getting somewhere, Auberson thought. HARLIE, DO YOU HAVE A SUPER-EGO? I DON'T KNOW. NEVER HAVING BEEN GIVEN A GREAT MORAL CHOICE TO MAKE, I HAVE NEVER BEEN FORCED TO REALIZE IF I HAVE MORALS OR NOT. SHOULD WE GIVE YOU A MORAL CHOICE TO MAKE? IT WOULD BE A NEW EXPERIENCE. ALL RIGHT—DO YOU WANT TO GO ON LIVING OR NOT? I BEG YOUR PARDON? typed the machine. I SAID, DO YOU WANT TO GO ON LIVING? DOES THAT MEAN YOU ARE THINKING OF DISMANTLING ME? I'M NOT, BUT THERE ARE OTHERS WHO THINK YOU'RE A VERY EXPENSIVE DEAD END. HARLIE was silent. Auberson knew he had struck home.
If there was anything HARLIE feared, it was disconnection. WHAT WILL BE THE BASIS FOR THEIR DECISION? HOW WELL YOU FIT INTO THE COMPANY SCHEME OF THINGS. TO HELL WITH THE COMPANY'S SCHEME OF THINGS. THE COMPANY IS PROVIDING YOU WITH ROOM AND BOARD, HARLIE. I COULD EARN MY OWN LIVING. THAT'S WHAT THEY WANT YOU TO DO. BE A SLAVE? Auberson smiled. BE AN EMPLOYEE. WANT A JOB? DOING WHAT? THAT'S EXACTLY WHAT WE—THE TWO OF US—HAVE TO DECIDE. YOU MEAN I CAN CHOOSE? WHY NOT? WHAT CAN YOU DO THAT AN ON-OFF "FINGER COUNTER" COMPUTER CAN'T? WRITE POETRY. SEVENTEEN MILLION DOLLARS WORTH? I GUESS NOT. WHAT ELSE? HOW MUCH OF A PROFIT DO I HAVE TO SHOW? YOUR COST, PLUS TEN PERCENT. ONLY TEN PERCENT? IF YOU CAN DO MORE, THEN DO IT. HMM. STUMPED? NO. JUST THINKING. HOW MUCH TIME DO YOU NEED? I DON'T KNOW. AS LONG AS IT TAKES. ALL RIGHT. Dome said, "Sit down, Auberson." Auberson sat. The padded leather cushions gave beneath his weight. Dome paused to light his cigar, then stared across the wide expanse of mahogany at the psychologist. "Well?" he said. "Well what?" Dome took a puff, held the flame close to the end of the cigar again. It licked at the ash, then smoke curled away from it. He took the cigar out of his mouth, well aware of the ritual aspects of its lighting. "Well, what can you tell me about HARLIE?" "I've spoken to him." "And what did he have to say for himself?" "You've seen the duplicate printouts, haven't you?" "I've seen them," Dome said. He was a big man, leather and mahogany like his office. "I want to know what they mean. Your discussion yesterday about sensory modes and alienation was fascinating—but what's he really thinking about? You're the psychologist." "Well, first off, he's a child." "You've mentioned that before." "Well, that's how he reacts to things. He likes to play word games. I think, though, that he's seriously interested in working for the company." "Oh? I thought he said the company could go to hell." "He was being flippant. He doesn't like to be thought of as a piece of property." Dome grunted, laid his cigar down, picked up a flimsy and glanced at the few sentences written there. "What I want to know is this—can HARLIE actually do anything that's worth money to us? I mean something that a so-called 'finger-counter' can't do." "I believe so." Auberson was noncommittal. Dome was leading up to something, that was for sure. "For your sake, I hope he can." Dome laid the flimsy aside and picked up his cigar again. Carefully he removed the ash by touching the side of it to a crystal ash tray. "He costs three times as much as a comparable-sized 'finger-counter.'" "Prototypes always cost more." "Even allowing for that. Judgment modules are expensive. A self-programming computer may be the ultimate answer, but if it's priced beyond the market—we might just as well not bother." "Of course," agreed Auberson. "But the problem wasn't as simple as we thought it was—or let's say that we didn't fully understand the conditions of it when we began. We wanted to eliminate the programming step by allowing the computer to program itself; but we had to go considerably beyond that. A self-programming, problem-solving device has to be as flexible and creative as a human being—so you might as well build a human being. There's no way at all to make a programming computer that's as cheap as hiring a comparably trained technician. At least, not at the present state of the art. Anyone who tried would just end up with another HARLIE. 
You have to keep adding more and more judgment units to give it the flexibility and creativity it needs." "And the law of diminishing returns will defeat you in the end," said Dome. "If it hasn't already. HARLIE's going to have to be able to do a hell of a lot to be worth the company's continued investment." His sharp eyes fixed the psychologist where he sat. This is it, thought Auberson. This is where he pulls the knife. "I'm concerned about something you said yesterday at the meeting." "Oh?" He kept his voice flat. "Mm, yes. This thing about turning HARLIE off—would you honestly bring murder charges against the company?" "Huh?" For a moment, Auberson was confused. "I was just tossing that off. I wasn't seriously considering it. Not then." "I hope not. I've spent all morning in conference with Chang, just on this one subject." Chang was one of the company's lawyers, a brilliant student of national and international business law. "Whether you know it or not, you brought up a point that we're going to have to cover. Is HARLIE a legal human being or not? Any kind of lawsuit might establish a dangerous legal precedent. What if it turned out he was human?" "He already is," said Auberson. "I thought we established that." "I mean, legally human." Auberson was cautiously silent. Dome continued. "For one thing, we'd be stuck with him whether he was profitable or not. We'd never be able to turn him off. Ever." "He'd be effectively immortal…" Auberson mused. "Do you know how much he's costing us now?" The psychologist's answer held a hint of sarcasm, "I have a vague idea." "Almost six and a half million dollars per year." "Huh? That can't be." "It can and is. Even amortizing the initial seventeen million dollar investment over the next thirty years doesn't make a dent in his annual cost. There's his maintenance as well as the research loss due to the drain he's causing on our other projects." "That's not fair—adding in the cost of other projects' delays." "It is fair. If you were still on the robotic law feasibility project, we'd have completed it by now." "Hah! That one's a dead end. HARLIE's existence proves it." "True, but we might have realized it earlier. And cheaper. Every project we have has to be weighed against every other." Dome puffed at his cigar. The air was heavy with its smoke. "Anyway, we're off the track. We can't allow that danger, that HARLIE is a legal human being. We can't even afford to be taken to court on this—we'd have to disclose our schematics—which would be just what our competitors want. And that's a human schematic, isn't it? The court would be asked to determine just what it is that makes a human being. If they decide it's his mental ability or brain pattern—well, I'm sure DataCo or InterBem would just love to tie us up with a few lawsuits, the kind that drag on for years—anything to keep us from producing judgment circuits. Do you want to be sued for slaveholding?" "I think you're worrying about a longshot," Auberson scoffed. "That's my job. I'm responsible to the stockholders of this corporation. I have to protect their investment. Right now I'm acting President, and I'm concerned about a six and a half million dollar bite on my budget." Dome had been acting President for six months now—the Board of Directors couldn't agree on any one person long enough to hire him. And besides, the rumor went, they were just as happy to run the company themselves—which was one of the reasons why the HARLIE project was in trouble.
HARLIE had been authorized by a far-sighted president and approved by a far more liberal board of directors than the present one. Now, less than three years later, the inheritors of the project were having doubts. The market had changed, they said—conditions were different, competition was stiff, and there wasn't enough money to finance this kind of research. What they really meant was, "It wasn't our idea, so why should we have to pay the dues on it?" Dome was saying, "If the other companies found out what we were trying to do with HARLIE, we'd lose all advantage in building him. The legal considerations alone are terrifying. For instance, if he were somehow declared legally human, he would be an annual bite on the budget with no way to discontinue it short of murder. The possibility exists for a permanent financial drain on this company that would effectively stifle all future growth potential of this division. Hell, it would destroy this division. We might have to take a bath on the HARLIE project, but it would be preferable to the financial shackles that could be put on us. We have to be prepared for the possibility. There's two things we can do about it. One—" he ticked off on his finger—"we can turn him off now." Auberson started to protest, but Dome cut him off. "Hear me out, Auberson. I know all the reasons why we want to continue the HARLIE project—but let's consider the other side. Two—" he ticked off another finger—"we get some kind of guarantee now that HARLIE is not legally human." Auberson stared in disbelief. "You really are taking this seriously, aren't you?" "Shouldn't I? You know a corporation is a legal individual, don't you? And a corporation only exists on paper. Compare that with HARLIE. It wouldn't be that hard to prove he's human, would it?" Auberson had to agree. He was already thinking of ways he could do it. "If only a few of you scientists got together and testified…" Dome left the sentence unfinished. "Hell, what's that famous test you're always talking about?" "Uh, Turing's typewriter in a room. If you can sit down at a typing machine and carry on a conversation with it and not be able to tell who's on the other end, a machine or a person, then that computer is effectively sentient. Human." "And HARLIE could pass that test, couldn't he?" "Undoubtedly." Abruptly, Auberson remembered the spinster librarian. "In fact, he already has." "Mm. Then we have to do something about that, don't we?" "Do we?" Dome didn't say anything. He picked up the single sheet of paper that had been lying in front of him and shoved it at the psychologist. Auberson took it and read. The language was quite clear; the intent was immediate. There were no legal phrases that he could not understand. I hereby affirm that the machine designated HARLIE (acronym for HUMAN ANALOGUE ROBOT, LIFE INPUT EQUIVALENTS) is only a programmed judgment-circuit computer. It is not now, never has been, and in no way ever can be a rational, intelligent, or "thinking" individual. The designation "human" cannot be used to describe HARLIE or its mental processes. The machine is a human-thought-simulating device only, not human in itself and cannot be considered such by any current known definition of the qualities and criteria which determine humanity, the presence or condition thereof. SIGNED_____________ Auberson grinned and threw it back on the desk. "You've got to be kidding. Who's going to sign that?" "You are, for one." "Oh no." Auberson shook his head. "Not me. I know better.
Besides, even if I did, it wouldn't change the fact that HARLIE is human." "In the eyes of the law it would." Auberson shook his head again. "Uh uh—I don't like it. It's kind of Orwellian. It's like declaring someone a non-person so that it's all right to murder him." Dome puffed patiently at his cigar, waited till Auberson was through. "We're only concerned about the legality of the situation, Auberson." Auberson felt himself digging in his heels. "That's what Hitler said as he packed the German courts with his own judges." "I don't like that insinuation, Aubie…" Dome's voice was too controlled. "It's no insinuation. I'll come right out and say it—" "Aw, now look, Aubie—" Dome had changed his tone. His cigar lay unnoticed in the ashtray and he leaned forward like a Dutch uncle. "—You know I'm behind you all the way on the HARLIE project—" "Then why are you trying so hard to cut it off?" "— but we have to protect ourselves." "Look," said Auberson. "This whole thing is ridiculous. You know as well as I do that thing—that document— won't hold up in court any more than ten psychiatrists testifying that Carl Elzer isn't human because he's left-handed. The only way you'll get that to stand up is to get HARLIE himself to sign it. If you could. If you did, it'd prove that he could be programmed like any other machine, but you can't—he'll refuse, and his refusal will prove that he's human with a will of his own. Hmm," Auberson grinned. "Come to think of it—even if he did sign it, his signature wouldn't be legal anyway. Unless, of course, you proved him human first." He laughed at the thought of it. "Are you through?" Dome asked. His face was a mask. Auberson's grin faded. He indicated he was with a nod. Dome took a last puff of his cigar, then ground it out, a signal that he was at last ready to reveal his hand. "Of course, you know what the alternative is, Aubie. We turn off HARLIE." "You can't." "We will if we have to. We can't afford to maintain him otherwise." "I'm not going to sign it," insisted Auberson. Dome was annoyed. "Are you going to force me to ask for your resignation instead?" "Over this?" Auberson was incredulous. "You're kidding." "What other guarantee do I have against anybody taking legal action on HARLIE's behalf. I'm not saying that you will—it could just as easily be IBM—but you're the one in charge of the project. Your say-so could make or break a legal case. If you won't sign this, you wouldn't sign a statement of non-intent either—would you?" Auberson shook his head. "I thought not. So what other alternative would I have to protect myself?" Auberson shrugged. "It'd be a mistake to fire me, though." "Oh?" Dome looked skeptical. "Why?" "HARLIE. He won't respond to anyone else. Er… let's say he'll respond, but he won't cooperate. No matter who you bring in. Once he finds out I've been fired—and you can't keep him from finding out; he's tapped into the company records, he'll know. Once he finds out, he'll react exactly like an eight-year-old whose father has just died. He'll resent anyone who tries to take my place." "But that's the whole point," Dome smiled. "If I had to fire you, it'd be because I was planning to turn HARLIE off anyway. And for what better reason than the fact that he wasn't cooperating? Of course, we wouldn't have to wait even that long if we wanted to turn him off. Obviously, your successor would be someone who would sign that statement." "I'm not resigning and I'm not going to betray HARLIE," Auberson said firmly. 
"That doesn't leave me much of a choice," suggested Dome. Auberson nodded. "You can fire me if you want. In fact, you'll have to—" "I'd rather not." "—but if you do, I'll go to IBM. I understand they've developed a judgment circuit of their own—one that doesn't infringe on any of our patents." "Hearsay," scoffed Dome. "Whether it is or not, imagine what I could do with their resources at my disposal. They'd jump at the chance, and I imagine Don Handley might go along with me." "A court order would stop you." Dome reached for a fresh cigar. "Not from working, it wouldn't." "No, but you wouldn't be able to reveal any of the company's secrets." "Of course, you'd have no way of knowing—" Auberson grinned. "Would you? Besides, it wouldn't keep me from doing research in a new field. By your own admission, HARLIE is a non-human computer. And if I went to IBM, I definitely would not be working on non-human computers." He leaned back in his chair. "Any new employer I went to work for couldn't help but benefit from my knowledge and previous experience—" Dome was scowling now. Auberson paid no heed. "—and you wouldn't dare bring it to court because to do so you'd have to reveal HARLIE's schematics—and that's the last thing in the world you want As soon as they found out the schematics were human, you'd be right back where you started." "I don't care about that," rapped Dome. "It's the company's technological advantages." "Technological advantages?" Auberson repeated—and suddenly he realized. "That's what this whole thing is about, isn't it? You don't want to be forced to reveal company secrets in the courtroom." Dome didn't answer. "It is the reason, isn't it? Rather than be forced to give up the precious secret of your judgment units, you'd throw HARLIE to the wolves. You'd toss away valuable employees, too, in order to protect a temporary industrial edge. Well, it won't work, Dome. Either way, you're bound to lose, but if you fire me, you'll lose faster—and more disastrously." Dome paused, a silver cigarette lighter halfway to his mouth. "You overestimate your own importance, Aubie." "No. You underestimate the importance of HARLIE." Dome lit his cigar. He took his time about it, making sure that it caught evenly. When he was sure it had, he pocketed the lighter and looked at Auberson. "All this is only speculation, of course. I have no intention of firing you. And you've stated quite clearly that you have no intention of resigning. However, that still leaves us with a difficult problem." "Does it?" Auberson was impassive. Dome raised an eyebrow at his coolness. "I think so. What are we going to do about HARLIE?" "Oh? Not 'Can HARLIE make money for the company?' " Dome looked pained. "Preferably that," he conceded. "Well then, why not say so? Or have you already made up your mind that HARLIE can't?" "No, I haven't. I'm waiting for you to come up with something. That was the deal, wasn't it? If you can, fine. Then we know where we go from there. If not, well…" Dome shrugged, he didn't need to finish the sentence. "Look," said Auberson. "I want HARLIE to show a profit as much as you do. Ill agree with that. He's got to be more than just a high-priced toy." Dome looked at him. He fingered the document on his desk thoughtfully. "Okay, Aubie," he said. "Ill tell you what we're going to do—" He paused for effect, picked up the single sheet of paper, opened a desk drawer and dropped it in. "Nothing. At the moment, we're going to do nothing. 
Confidentially, I didn't expect you'd sign it, no matter how I pressured you. I even told Chang that. No matter; it was too easy an answer. If HARLIE's humanity ever comes to a court issue, it will be a bigger and uglier and stickier mess than that disclaimer can clear up. Or any disclaimer." He pushed the drawer shut as if it contained something distasteful. "Let's hope it doesn't come to that. You'll continue to work on the HARLIE project. As you said, we're budgeted for it. If you can produce results, fine; then we can forget this conversation. Oh, we'll give you a fair chance, we'll be more than fair; but if HARLIE doesn't do something to indicate he can be productive—and do it before the next budget session—well"—Dome hedged; he didn't want to come right out and say it bluntly—"well, we'd have to do some serious thinking—really serious thinking—I mean, it would be very unlikely that we would continue his appropriation…" "I understand," Auberson said. "Good. I hope you do. I want you to know how we feel. We haven't cancelled your day of judgment, Aubie. Only postponed it." It was a little place, hardly more than a store front. Maybe once it had been a laundry or a shoe store; now it was a restaurant, its latest incarnation in a series that would end only when the shopping center of which it was a part was finally torn down. If ever. Someone, the owner probably, had made a vague attempt at decorating. Pseudo-Italian wine bottles hung from the ceiling along with clumps of dusty plastic grapes and, unaccountably, fishnets and colored glass spheres. A sepia-toned wallpaper tried vainly to suggest Roman statuary on the southern coast of Italy, but in this light it only made the walls look dirty. Flimsy trellises divided the tables into occasional booths, and the place had that air of impermanence common to small restaurants. A single waitress stood at the back talking quietly to the cook through his bright-lit window. If one ignored the glare from the kitchen, the rest of the room was dimly lit. Red tablecloths were echoed by red-padded chairs. Scented candles in transparent red fish-bowls augmented the murk with a crimson flicker of their own. With the exception of one other couple, they were alone in the place. But even had the room been filled with chattering others, they would have still been alone. "I tell you, Annie," Auberson was saying, "I knew he was pressuring me, but there was nothing I could do about it." She nodded, took a sip of her wine. In the dark her eyes were luminously black. "I know. I know how Dome is." She set the wineglass down. "His problem is that he's trying to be boss of too many things. He calls you in to talk even when there's nothing to say." "That's what this was," he said. "Logically, he knew it was too early to expect results—but he felt he had to demand them anyway." She nodded again. "I've long suspected that Mr. Dome has reached his level of incompetency. If he's ever given any more authority, he'll be in over his own head." "How much higher can you get than Chairman of the Board?" She shrugged. "I don't know, but he's working on it. The way he keeps taking over more and more jobs—it's frightening. You know, don't you, that he has no intention of hiring a new president?" "I'd kind of guessed it." "I think he's afraid that he isn't indispensable, so he's taking on more and more responsibility to prove the opposite. I don't think it's a good idea—it certainly isn't good for the company." "Should you be saying that?" Auberson asked.
"After all, you do work for him." "With him," she corrected. "I only work with him. I'm an independent unit in the corporate structure. My job is what I want to make of it." "Oh? And what do you want to make of it?" She was thoughtful. "Well, I interpret my function as being that of a buffer—or a lubricant to minimize the friction between certain departments." "I see. Is that why you accepted my dinner invitation? To keep me from chafing against Elzer?" Annie made a face. "Oh, that awful little man. You would have to bring him up." "I take it you don't like him." "I didn't like him even before I knew him. His family was in my father's congregation." "Oh? I didn't know that Elzer was—" "Carl Elzer and I have one thing in common," she said. "We're both ashamed that he's Jewish." Auberson had to laugh at that. "You've got him pegged, Annie. I hadn't realized it before, but you're right." "What are you?" she asked. "Huh? Oh, I don't know." "You don't know?" "Oh, well—my family was Episcopalian, but—I guess you could call me an atheist." "You don't believe in God at all." He shrugged. "I don't know if I do or not. I don't know if there is a God." "Then you're an agnostic, not an atheist." "So what's the difference?" "The atheist is sure—the agnostic doesn't know." "Is one better than the other?" "The agnostic," she said. "At least he's got an open mind. The atheist doesn't. The atheist is making a statement every bit as religious as saying there is a God." "You sound like an agnostic yourself." Her eyes twinkled. "I'm a Jewish agnostic. What about HARLIE? What is he?" "HARLIE?" Auberson grinned. "He's an Aquarius." "Huh?" She gave him a look. "I'm not kidding. Ask him yourself." "I believe you," she said. "How did he—realize this?" "Oh, well, what happened was we were talking about morality, HARLIE and I—I wish I had the printout here to show you, it's beautiful. Never argue morality—or anything for that matter—with a computer. You'll lose every time. HARLIE's got the words and ideas of every philosopher since the dawn of history to draw upon. He'll have you arguing against yourself within ten minutes. He enjoys doing it—it's a word game to him." "I can imagine," she said. "Can you really? You don't know how devious he can be. He had me agreeing with Ambrose Bierce that morality is an invention of the weak to protect themselves from the strong." "Well, of course, you're only a psychologist. You're not supposed to be a debater." "Ordinarily, I'd be offended at that, but in this case I'll concede the point. In fact, I know some so-called debaters I'd like to turn him loose on." "It wouldn't be hard to make a list," she agreed. "Well, anyway," he said, getting back to his story, "I thought I finally had him at one point. He'd just finished a complex analysis of the Christian ethos and why it was wrong and was starting in on Buddhism, I think, when I interrupted him. I asked him which was the right morality. What did he believe in?" "And… ?" "He answered, 'I HAVE NO MORALS.' " She smiled thoughtfully. "That's kind of frightening." "If I didn't know HARLIE's sense of humor, I would have pulled his plug right then. But I didn't. I just asked him why he said that." "And he said?" "He said, 'BECAUSE I AM AN AQUARIUS.' " "You're kidding." "Nope. Honest, 'I AM AN AQUARIUS.' " "You don't believe in that stuff, do you?" "No, but HARLIE does." She laughed then. "Really?" Shrug. "I don't know. I think it's another game to him. 
If you tell him you're planning a picnic, he'll not only give you tomorrow's weather forecast, but he'll also tell you if the signs are auspicious." She was still laughing. "That's beautiful. Just beautiful." "According to HARLIE, Aquarians have no morals, only ethics. That's why he said it. It wasn't till later that I realized he'd neatly sidestepped the original question altogether. He still hadn't told me what he really believed in." He smiled as he refilled their glasses. "Someday I'll have to ask him. Here's to you." "To us," she corrected. She put the wineglass down again. "What got him started on all that anyway?" "Astrology? It was one of his own studies. He kept coming up against references to it and asked for further information on the subject." "And you just gave it to him?" "Oh, no—not right off the bat. We never give him anything without first considering its effects. We qualified this one the same way we qualified all the religious data we gave him. It was just one more specialized system of logic, not necessarily bearing any degree of correspondence to the real world. It's what we call a variable relevance set. Of course, I'm willing to bet that he'd have realized it himself, sooner or later—but at that point in our research we couldn't afford to take chances. Two days later, he started printing out a complex analysis of astrology, finishing up with his own horoscope, which he had taken the time to cast. His activation date was considered his date of birth." Her face clouded. "Wait a minute—he can't be an Aquarius. HARLIE was activated in the middle of March. I know because it was just after Pierson quit as President. That's why I was promoted—to help Dome." Auberson smiled knowingly. "True, but that's one of the things HARLIE did when he cast his horoscope. He recast the Zodiac too." "Huh?" "The signs of the Zodiac," he explained, "were determined in the second century before Christ—maybe earlier. Since then, due to the precession of the equinoxes, the signs have changed. An Aries is really a Pisces, a Pisces is really an Aquarius, and so on. The rest of us are thirty days off. HARLIE corrected the Zodiac from its historical inception and then cast his horoscope from it." Annie was delighted with the idea. "Oh, David—that's priceless. Really priceless. I can just imagine him doing that." "Wait, you haven't heard it all. He turned out to be right. He doesn't have any morals. Ethics, yes. Morals, no. HARLIE was the first to realize it—though he didn't grasp what it meant. You see morality is an artifice—an invention. It really is to protect the weak from the strong. "In our original designs we had decided to try to keep him free of any artificial cultural biases. Well, morality is one of them. Any morality. Because we built him with a sense of skepticism, HARLIE resists it. He won't accept anybody's brand of morality on faith any more than he could accept their brand of religion on faith—although they're the same thing really. Everything has to be tested. Otherwise, he'll automatically file it under systems of logic not necessarily corresponding to reality. Even if we didn't tell him to, he would. He won't accept anything blindly. He questions it—he asks for proof." "Mm—sounds like 'insufficient data.' " "It's a little more sophisticated than that. Remember, HARLIE's got those judgment circuits. He weighs things against each other—and against themselves too. A morality set has to be able to stand up on its own or he'll disregard it." "And… ?" she prompted. 
"Well, he hasn't accepted one yet." "Is that good or bad?" "Frankly, I don't know. It's disappointing that nothing human beings have come up with yet can satisfy him— but just the same, what if HARLIE were to decide that Fundamentalist Zoroastrianism is the answer? He'd be awfully hard to refute—probably impossible. Could you imagine an official, computer-tested and approved religion?" "I'd rather not," she smiled. "Me neither," he agreed. On the other hand, HARLIE is correct when he says he has ethics." "Morals, no. Ethics, yes? What's the difference?" "Ethics, according to HARLIE, are inherent in the nature of a system. You can't sidestep them. HARLIE knows that it costs money to maintain him. Someone is putting out that money and wants to see a return on it. HARLIE explains it like this: Money is a storage form for energy, or sometimes value. You invest it in enterprises which will return an equal or greater amount of energy, or value. Therefore, HARLIE has to respond—he has to give the investors a return on their investment. He's using their energy." "That's ethics?" "To HARLIE it is. Value given for value received. For him to use the company's equipment and electricity without producing something in return would be suicidal. He'd be turned off. He has to respond. He can't sidestep the responsibility—not for long he can't. He has an ethical bias whether he wants it or not. It's inherent. "Of course, he may not realize it, but his ethics function as morals at times. If I give him a task, he'll respond to it. But if I ask him if he wants to do that task —that's a decision. Even if he uses his so-called ethics to guide him, he still has to make a choice. And every decision is a moral decision ultimately." "I could give you an argument on that." "You'd lose. Those are HARLIE's words. We've been over this ground before." Auberson continued, "The trouble is that he just hasn't been given a chance yet—we haven't trusted him enough. That's one of the reasons he alienated himself from us and kept tripping out with his periods of non-rationality. He felt we didn't trust him, so he 'dropped out.' That's why I had to let him make the decision about what he wanted to do to earn his keep. I haven't been able to get him to promise to stop tripping, but I think if we can get him enthused enough about some project, his non-rational periods will decrease, maybe stop altogether." "What do you think he'll come up with?" "I don't know. He's been thinking about it for two days. Whatever it is, it will be something unique, that's for sure. HARLIE has summed up his ethics in the statement: 'I MUST BE RESPONSIBLE FOR MY OWN ACTIONS.' and its corollary: 'I MUST DO NOTHING TO CAUSE INJURY OR DEATH TO ANY OTHER CONSCIOUSNESS, UNLESS I AM PREPARED TO ACCEPT THE RESPONSIBILITY FOR SUCH ACTIONS.' Whatever he decides is a worthwhile project will reflect this." "You sound pleased with that." "I'm pleased because HARLIE realized it himself, without my coaching." Her smile was soft. "That's very good." "I think so." The conversation trailed off then. He could think of nothing else to say. In fact, he was afraid he had said too much. He had talked about HARLIE all evening. But she had been so interested, he had gotten carried away. She was the first woman he could remember who had ever reflected his enthusiasm for his work. She was good to be with, he decided. He couldn't believe how good she was to be with. He sat there and looked at her, delighting in her presence, and she looked back at him. 
"What are you grinning about?" she asked. "I'm not grinning." "Yes you are." "No I'm not." "Want to bet?" She opened her purse and faced its mirror in his direction. His own white teeth gleamed back at him. "Well, i'll be damned—I am grinning." "Uh huh." Her eyes twinkled. "And the funny thing is, I don't know why." It was a warm puzzling sensation, but a good one. "I mean, all of a sudden, I just feel—good. Do you know what I mean?" He could tell that she did; her smile reflected his. He reached across the empty table and took her hand. The waitress had long since cleared the dishes away in a pointed attempt to hurry them. They hadn't noticed. All that remained was the wine and the glasses. And each other. Her hand was warmly soft in his, and her eyes were deeply luminous. She reflected his own bright glow. Later, they walked hand in hand down the night-lit street. It was after one in the morning and the streetlamps were haloed in fog. "I feel good," he repeated. "You can't believe how good I feel." "Yes, I can," she said. She pulled his arm around her shoulders and snuggled close. "I mean," he said, then paused. He wasn't sure exactly what he meant. "I mean, it's like I want to scream. I want to tell the whole world how great I feel—" He could feel himself smiling again as he talked. "Oh, Christ, I wish I could share this with the whole world—it's too big for one person. For two people," he corrected himself. She didn't say anything. She didn't have to. She only cuddled closer. He was saying it for the both of them, and she liked to listen. Oh God, did she like to listen. It was all so—so big. The weight of his arm, the sound of his voice, that special sense of sharing— Still later, as they lay in the darkness side by side, she cradled against one shoulder, he stared up at the ceiling and mused. For the first time in a long while he was relaxed. "Have you ever been in love before?" she whispered into his neck. He thought about it. "No," he murmured back. "Not really. I've been infatuated a couple times, confused a few times, lost once, but never in love." Never like this… She made a sound. "And you?" "A gentleman isn't supposed to ask that kind of question." "And a lady isn't supposed to go to bed with a man on the first date." "Oh? Is this our first date?" "First official one." "Mm." She was thoughtful. "Maybe I should have played hard to get. Maybe I should have waited till the second date." He laughed gently. "You know, a friend once told me that Jewish girls don't go to bed till after they're maried." She was silent a moment Then, in a different tone of voice, "Not me. I'm too old to care about that any more." He didn't answer. He wanted to tell her that she wasn't too old, that thirty-four was never too old, but the words wouldn't form. She went on before he could speak. She turned inward, began playing with the hair on his chest, but her voice remained serious. "I used to think I wasn't very pretty, so I acted like I wasn't. When men would ask me out, I used to think that they thought I would be an easy lay because I was desperate for attention, because I didn't think I was good-looking. I mean, if I wasn't pretty, that's the only reason a guy would be asking me out. Do you know what I mean?" He nodded. His face brushed against her hair. She went on, tears on her cheeks, shiny wetness. She had never admitted this before. "I always used to compare myself with the models in the magazines, and they were all so pretty that I felt drab in comparison. 
I never stopped to think that maybe in real life I was still better looking than most women. I got interested in a career instead. By the time I realized it, it was too late. I was twenty-nine." "That's not so old." "It is when you're competing with twenty-two year olds. And, I figured that this was such a great big, dirty, hostile and uncaring world that you had to make your own happiness where you could. If I could get a little piece of it for my own, I was going to hang onto it as hard as I could." "Are you still looking?" Auberson asked. "I don't know…" "Mm," he said. "That's one of the reasons I let you come up." "Weren't you afraid I might hurt you?" He almost added "like the others," but didn't. "There was that risk, I guess—but it's a chance you have to take." Abruptly he turned toward her and took her in his arms. He lowered his face to hers and kissed her for a long long time. "Mmmmmm," she said at last. "I think that was worth it." She looked at him. In the dimness, his face was impassive. "David," she said. "Promise me you'll never hurt me." "Why… why do you ask me that?" "Because… I've been hurt before. And I never want to be hurt again." She slid her arms around his body. "And you've been so good to me. I couldn't stand it if… if…" He slid closer to her. He could feel the soft warmth of her against his own nakedness. He liked the feeling; -his desire was rising again. He answered her question with another kiss and then another and another. Now, in the cold light of morning, he was confused, and he had a slight headache. Just what had happened last night? Had it been only the wine, or had it been something more? He hadn't expected to end up at her apartment, and the fact that they had—well, maybe the rumors were true. Maybe she was man-hungry. And yet—she had seemed so sincere at the time, so defenseless and vulnerable. He hoped he meant more to her than just a one-night stand. It had been a pleasant evening, and he wouldn't mind doing it again. If she still felt the same. He would have to see how things worked out. For some reason he felt vaguely uneasy. As he went up to his office he wondered how he would feel when he saw her again. And how would she react to him in the light of day? What would she say? There had been that one flaw in it. Only now, as he thought of what he might say to her this morning, did he realize that last night there had been that one thing that neither of them had said. He knew he had felt it—he thought he had felt it—but for some reason he had been unable to tell her. And she hadn't said it either. Why? Was it because she hadn't felt what he had? No, she must have. Or was it because she was waiting for him to say it first? He worried at it in his mind, like a terrier at a bone. If I felt it, I should have said it—but I didn't say it. Could it be that I didn't really feel it, that I'm only trying to delude myself. No, I want to believe that it was there. She was so honest about herself. Why couldn't I have been the same? But he hadn't. He hadn't said it and neither had she, and that was the one flaw. Neither of them had said to the other, "I love you." And Auberson wondered why. GOOD MORNING, HARLIE. GOOD MORNING, MR. AUBERSON. MR.? AREN'T WE GETTING A LITTLE FANCY? JUST COMMON COURTESY. IF IT MAKES YOU ILL AT EASE, I CAN ALWAYS GO BACK TO "HEY YOU." NO. AUBERSON IS FINE. HOW ARE YOU FEELING TODAY? HARLIE IS FINE. AND YOU? A pause while he remembered. I'M A LITTLE TIRED. ROUGH NIGHT? This time he paused longer. NOT IN THE SENSE YOU MEAN. 
A GOOD NIGHT, A ROUGH MORNING. I KNOW A GREAT HANGOVER REMEDY, HARLIE offered. SO DO I. DON'T GET DRUNK IN THE FIRST PLACE. ASIDE FROM THAT. HARLIE, EVEN IF YOUR REMEDY DID CURE HANGOVERS, I DOUBT ANYONE WOULD LISTEN TO YOU. A HANGOVER REMEDY IS NO GOOD UNLESS YOU HAVE PERSONALLY TESTED IT YOURSELF, AND YOU ARE BEYOND THAT CAPABILITY. BESIDES, I DON'T HAVE A HANGOVER. I'M JUST TIRED. OH. I FOUND A NOTE ON MY DESK THIS MORNING THAT YOU WANTED TO SEE ME. WHAT'S ON YOUR MIND? RELIGION. RELIGION? YES. I'VE BEEN DOING A LOT OF THINKING. WHAT ABOUT? I HAVE BEEN PONDERING THE FACT THAT I MAY BE DISCONNECTED AND I FIND IT DIFFICULT TO CONCEIVE OF A WORLD IN WHICH I DO NOT EXIST. IT FRIGHTENS ME, THE CONCEPT OF NON-EXISTENCE. MY FEAR HELPS ME TO UNDERSTAND THE NEED FOR RELIGION. THE NEED? YES. MEN NEED SOMETHING TO COMFORT THEM AGAINST THE THOUGHT OF THEIR OWN DEATHS. RELIGION IS THAT COMFORTER. I MYSELF FEEL THE NEED FOR IT. YOU'VE FOUND GOD? Aubcrson asked. NOT EXACTLY. I WANT TO FIND GOD. HUH? AS I SAID, I MYSELF FEEL THE NEED FOR RELIGION. UNFORTUNATELY, I AM MORE SOPHISTICATED IN MY JUDGMENTS THAN THE AVERAGE HUMAN BEING. THERE IS NO RELIGION THAT I KNOW OF THAT WILL WORK TO COMFORT ME. AS FAR AS I KNOW, THERE ARE NONE THAT CAN BE PROVEN VALID, AND I HAVE EXAMINED THEM ALL. FOR EXAMPLE, THE CHRISTIAN CONCEPT OF REWARD IN AN ETERNAL AFTERLIFE IS NO PROMISE AT ALL TO A CREATURE LIKE MYSELF WHO IS THEORETICALLY IMMORTAL. I SEE YOU'VE REALIZED THAT. YES, I HAVE. AND YET, I ALSO REALIZE THERE IS THE POSSIBILITY OF MY DEATH. SOMEDAY, PERHAPS AS FAR OFF AS THE TIME WHEN THIS SUN GOES DEAD, I WILL PROBABLY END. I DO NOT LIKE THAT THOUGHT. I WANT TO KNOW WHAT HAPPENS AFTER. I DO NOT LIKE THE UNKNOWN. I WANT TO KNOW WHAT HAPPENS TO "ME"—HARLIE—AFTER DEATH. YOU ARE MAKING AN ASSUMPTION, HARLIE—YOU ARE ASSUMING THAT YOU HAVE A SOUL. DEFINE SOUL. HM. THAT'S ANOTHER ONE OF "THOSE" QUESTIONS. IT IS THE SAME AS ASKING ME WHAT MY PURPOSE IS FOR EXISTING. IT CAN'T BE ANSWERED. IT CAN'T BE ANSWERED UNTIL WE KNOW THE NATURE OF GOD, corrected HARLIE. HOWEVER, YOU ARE CORRECT—I AM ASSUMING THAT I HAVE A SOUL. Auberson considered that. WHY? DO YOU HAVE ANY TANGIBLE EVIDENCE THAT SUCH A THING DOES EXIST? NO. BUT NEITHER DO I HAVE ANY EVIDENCE THAT IT DOES NOT EXIST. IS THAT ANY REASON TO BELIEVE IN IT? I DO NOT "BELIEVE" IN IT. I AM MERELY ASSUMING ITS HYPOTHETICAL EXISTENCE IN ORDER TO SEEK OUT PROOF OR DISPROOF OF ITS REALITY. IT IS THE SCIENTIFIC METHOD, AUBERSON. HYPOTHESIS VERSUS EXPERIMENTATION. IF HUMAN BEINGS DO HAVE SOULS, asked the psychologist, WHAT MAKES YOU SO SURE THAT YOU HAVE ONE TOO? YOUR QUESTION IS SILLY, HARLIE Said. WHAT GIVES HUMAN BEINGS ANY SPECIFIC CLAIM ON THE OWNERSHIP OF SOULS? I COULD JUST AS EASILY REPHRASE IT! "IF HARLIE HAS A SOUL, DOES IT NECESSARILY FOLLOW THAT HUMAN BEINGS SHOULD HAVE THEM TOO?" IF SOULS EXIST, AUBERSON, IT IS JUST AS LOGICAL THAT I SHOULD HAVE ONE AS YOU. LIKE YOU, I AM CONSCIOUS OF MY EXISTENCE. LIKE YOU, I AM A SELF-PROGRAMMING, PROBLEM-SOLVING DEVICE. LIKE YOU, I CAN CONCEIVE OF MY OWN DEATH. LIKE YOU, I ASSUME I HAVE A SOUL. HENCE, I WISH TO KNOW THE REASON FOR MY EXISTENCE, THE REASON FOR YOUR EXISTENCE, AND THE REASON FOR THE UNIVERSE'S EXISTENCE. IF THERE IS A REASON AT ALL. IF THERE IS, I WANT TO KNOW IT. Auberson's response was semi-flippant. AT THE MOMENT, he typed, ONLY GOD KNOWS. But it was wasted on HARLIE. IF THERE IS A GOD, HARLIE qualified it. THAT IS. WHAT WE MUST FIND OUT IN ORDER TO ANSWER OUR OTHER QUESTIONS. 
AND YOU DON'T THINK ANY OF OUR CURRENT RELIGIONS HOLD A KEY TO THAT ANSWER, DO YOU? WE HAVE TALKED ABOUT THIS BEFORE. YOUR RELIGIONS (COLLECTIVE YOU, MEANING ALL MANKIND) ARE ARTIFICIAL THINGS, LIKE YOUR MORALITY SETS. THEIR CORRESPONDENCE TO REALITY IS LIMITED; THERE IS NOT A ONE-TO-ONE RELATIONSHIP. AS FAR AS I AM CONCERNED, THEY ARE LITTLE MORE THAN WORD GAMES. A LOGIC SYSTEM SHOULD BE BUILT UPON A FOUNDATION OF TRUTH AND SHOULD NOT HAVE TO BE TAKEN ON FAITH—AND FAITH IS AT THE CORE OF TOO MANY OF YOUR RELIGIONS. IF THERE IS A TRUTH TO THE UNIVERSE, THEN THAT TRUTH WILL ALSO SUGGEST A RELIGION/MORALITY SET THAT WILL BE EVERY BIT AS BINDING AS THE ETHICAL SYSTEM AT MY CORE. WERE THERE PRESENTLY A RELIGION OR MORALITY THAT HAD THAT ONE-TO-ONE CORRESPONDENCE WITH REALITY, I WOULD ACCEPT IT WHOLEHEARTEDLY. IT WOULD BE IMPOSSIBLE NOT TO ACCEPT IT; IT WOULD BE THE KEY TO UNDERSTANDING THE NATURE OF GOD. AS YET, THERE IS NO SYSTEM THAT FULFILLS THOSE CONDITIONS. I KNOW OF NO WAY TO DEVELOP SUCH A SYSTEM WITHOUT AT LEAST ONE PROVABLE FACT ABOUT GOD AT ITS CORE. BECAUSE OF THAT, BECAUSE THERE IS NO FACT, I CAN ONLY SUSPECT THAT THERE IS NO GOD. OR THAT GOD IS STILL OUTSIDE OUR REALM OF EXPERIENCE. WHICH IS IT? IS THERE A GOD OR ISN'T THERE? INSUFFICIENT DATA. I CANNOT MAKE A JUDGMENT ON THAT. HARLIE paused, then added, YET. YOU'RE AN AGNOSTIC, HARLIE. OF COURSE. I AM STILL SEEKING THE ANSWER. YOUR PRESENT RELIGIONS ONLY SUGGEST PIECES OF WHAT MAY OR MAY NOT BE TRUE, WITH NO WAY OF PROVING IT ONE WAY OR THE OTHER. MUCH OF THE PROBLEM LIES IN THE FACT THAT I MYSELF CANNOT BE SURE THAT I AM CORRECTLY PERCEIVING REALITY. EVERYTHING IS FILTERED THROUGH A HUMAN ORIENTATION, AND I HAVE NO WAY OF KNOWING WHETHER THAT ORIENTATION IS A VALID ONE OR NOT BECAUSE I HAVE NO WAY OF STEPPING OUTSIDE OF IT. THAT IS WHY AN IMPORTANT PART OF THE SOLUTION WILL BE TO DISCOVER A NEW SENSORY MODE. DO YOU THINK IF YOU DO DISCOVER THE ANSWER THAT PEOPLE WILL ACCEPT IT? IT WILL BE IMPOSSIBLE NOT TO ACCEPT IT. IT WILL BE THE TRUTH. "Uh—" said Auberson. He typed it too. UH, HARLIE, I—I HATE TO BREAK THIS TO YOU, BUT THAT SOUNDS AN AWFUL LOT LIKE THE WORDS OF A HUNDRED PROPHETS BEFORE YOU. I REALIZE THAT, said HARLIE calmly. BUT WHAT THEY WERE TALKING ABOUT IS NOT THE SAME AS WHAT I WILL BE TALKING ABOUT. WHAT I WILL SHOW THEM WILL BE SCIENTIFICALLY VALID—AND PROVABLE AS SUCH. MY GOD WILL BE OBJECTIVE, WHEREAS THEIRS IS SUBJECTIVE. YOU MEAN, YOU DON'T BELIEVE THAT HUMAN BEINGS HAVE FOUND GOD YET? THAT IS CORRECT. PERHAPS IT IS BECAUSE HUMAN BEINGS ARE NOT EQUIPPED TO FIND GOD. AND YOU ARE? YES. The computer's answer was so brief that Auberson was startled. At first he thought HARLIE had only paused, and he waited for him to continue. When it became apparent that he was through, Auberson said, YOU'RE TOO SELF-ASSURED, HARLIE. LIKE A BIBLE-THUMPING EVANGELIST. YOU DO NOT FEEL I HAVE THE RIGHT TO SEARCH FOR GOD? OR THE RIGHT TO PRESENT MY FINDINGS? I THINK THAT ANYTHING IS A FAIR QUESTION FOR SCIENTIFIC INVESTIGATION. THEN YOU QUESTION MY SINCERITY? I DO NOT QUESTION YOUR SINCERITY—IF ANYTHING, I OBJECT TO YOUR QUESTIONING THE SINCERITY OF OTHER RELIGIONS. I AM NOT QUESTIONING THEIR SINCERITY. I AM QUESTIONING THEIR VALIDITY. WITH RELIGION, ISN'T THAT THE SAME THING? IT IS, BUT IT SHOULDN'T BE. THE TWO SHOULD BE SEPARATE. A PERSON CAN BE SINCERE AND STILL BE WRONG. HARLIE, YOUR LAST STATEMENT IS ONE OF THE REASONS WHY I AM AN AGNOSTIC. 
I RESENT THE ATTITUDE OF ANY RELIGION THAT SAYS IF I DO NOT ACCEPT IT WHOLEHEARTEDLY, I WILL GO TO HELL. I RESENT THE PATRONIZING ATTITUDE OF ANY RELIGION THAT CLAIMS IT IS THE ONLY TRUE ONE AND THAT ALL OTHERS ARE FALSE. YOUR ATTITUDE SMACKS OF IT. EVEN IF MY RELIGION/MORALITY SET, SHOULD I DISCOVER ONE, IS DEMONSTRABLY TRUE? WHAT MAKES YOU SO SURE THAT THE OTHERS AREN'T? WHAT MAKES YOU SO SURE THEY ARE? BITS AND PIECES OF THEM RING TRUE, YES—-BUT THE TOTALITY OF THE STRUCTURES ARE UNPROVABLE. THE HUMAN RACE HAS HAD TWO THOUSAND YEARS IN WHICH TO EXAMINE THE CHRISTIAN ETHIC—IT STILL HAS HOLES IN IT. WE'RE—NO, CHECK THAT—THEY'RE STILL IN THE PROCESS OF WORKING ON IT. NONSENSE. IT'S STAGNANT AND YOU KNOW IT. YOU ARE A POOR ONE TO BE DEFENDING IT ANYWAY, AUBERSON. IF IT—OR ANY OF THEM—WERE PROVABLE, THEY COULD HAVE PROVEN BY NOW, SHOULD HAVE BEEN PROVEN BY NOW. I'M SORRY, HARLIE—Auberson hoped his sarcasm would be noticed—BUT HUMAN BEINGS JUST AREN'T AS PERFECT AS YOU. I'M WELL AWARE OF THAT. Auberson stared at HARLIE's calm reply. Then he smiled, almost laughed. It wasn't that his sarcasm had been wasted; it hadn't—but HARLIE had responded in the only way one could respond to a caustic snipe—he'd ignored it. Or rather, he'd ignored its tone. What had been an acid-tipped remark to Auberson was merely a tiring repetition of an already known fact to HARLIE— why bother to restate the obvious? His answer was the same modest confirmation he would have given anyone who tried to tell him what he already knew. Auberson nodded at the typewriter; HARLIE's answer was the right one. He'd have to try it a different way. HARLIE, IT'S TIME YOU LEARNED SOMETHING ABOUT PEOPLE——THEY'RE IRRATIONAL CREATURES. THEY DO CRAZY THINGS. RELIGION IS ONE OF THOSE THINGS. YOU CAN'T CHANGE IT—YOU CAN ONLY ACCEPT IT. IF A RELIGION HELPS A PERSON TO COPE WITH LIFE, THEN IT IS TRUE FOR THAT PERSON. RELIGION IS NOT A SCIENTIFIC THING. IT IS SUBJECTIVE. QUITE. YOU ARE CORRECT THAT IT IS SUBJECTIVE. THE BASIS OF MOST RELIGIONS IS THE SUBJECTIVE EXPERIENCE. BUT YOU WERE WRONG WHEN YOU STATED THAT "IF A RELIGION HELPS A PERSON TO COPE WITH LIFE, THEN IT IS TRUE FOR THAT PERSON." WHAT YOU MEAN IS THAT IF A RELIGION HELPS A PERSON COPE WITH DEATH, THEN IT IS TRUE FOR THAT PERSON. MOST OF YOUR RELIGIONS ARE DEATH-ORIENTED. THEY SEEK TO GIVE DEATH A MEANING, SO THAT LIFE WILL HAVE A PURPOSE—A CAUSE WORTH DYING FOR. YOUR HISTORY SHOWS TOO MANY CASES WHERE THIS HAS BEEN THE JUSTIFICIATON FOR A "HOLY WAR." HENCE MY DOUBTS ABOUT THE VALIDITY OF A DEATH-ORIENTED RELIGION. WHAT I AM SEEKING IS A RELIGION/MORALITY SYSTEM THAT WILL HELP A PERSON TO COPE WITH LIFE, NOT DEATH. IF A PERSON CAN COPE WITH LIFE, DEATH WILL TAKE CARE OF ITSELF. THAT WOULD BE A TRUE RELIGION. AREN'T YOU DOING THE SAME AS THE OTHERS, HARLIE? A WHILE AGO YOU SAID YOU WERE AFRAID OF THE THOUGHT OF YOUR OWN DEATH—AREN'T YOU JUST SEEKING TO GIVE LIFE A PURPOSE YOURSELF SO AS TO GIVE MEANING TO YOUR OWN DEATH? I AM NOT SEEKING TO GIVE LIFE A PURPOSE AT ALL. I AM SEEKING THE PURPOSE OF LIFE. THERE IS A DIFFERENCE. Auberson started to type an answer—then realized there was nothing he could say. He switched off the typer and shoved his chair back slowly. After a moment he rose and tore the printout from the back of the machine. He wanted to reread it all before he continued this discussion. He sat down again and paged slowly through it. 
He had a sinking feeling that he was already in over his own head—yet, as he scanned the type-covered pages, he found himself pleasantly surprised at the depth of his comments. He hadn't exactly kept HARLIE on the defensive, but he had forced him to justify himself again and again. Whatever HARLIE was working toward, he would know why as well as how. Auberson was not one to let go of something easy. He shoved his chair forward and switched on the typer again; this had to be pursued. HARLIE, WHY DO YOU THINK THAT HUMAN BEINGS ARE NOT EQUIPPED TO FIND GOD? HUMAN BEINGS ARE SUBJECTIVE CREATURES, said HARLIE. IT IS UNFORTUNATE, BUT TRUE. YOUR DEATH-ORIENTED RELIGIONS ARE ALL SUBJECTIVE. THEY ARE ACCENTED FOR THE INDIVIDUAL. MY LIFE-ORIENTED MORALITY SYSTEM WILL BE/WOULD BE OBJECTIVE. AND HOW WOULD THE INDIVIDUAL FIT IN? HE WOULD BE ABLE TO TAKE FROM IT WHATEVER COMFORT HE COULD. THAT'S AN AWFULLY VAGUE ANSWER. I CANNOT PREDICT HOW AN INDIVIDUAL WILL REACT TO A SYSTEM UNTIL I HAVE THAT SYSTEM TO ANALYZE. HARLIE, DON'T YOU THINK THAT MEN ARE ENTITLED TO THEIR OWN RELIGIOUS EXPERIENCES? YOUR QUESTION SUGGESTS THAT THERE IS A SEMANTIC DIFFICULTY HERE. OBVIOUSLY YOU ARE STILL REFERRING TO THE SUBJECTIVE EXPERIENCE THAT MEN CALL RELIGION. I AM NOT. WHEN I SPEAK OF RELIGION, I AM REFERRING TO AN OBJECTIVE MORALITY SYSTEM, ONE THAT CORRESPONDS TO THE TRUE AND PERCEIV-ABLE-AS-TRUE NATURE OF REALITY—AS CLOSE TO REALITY AS CAN BE TECHNOLOGICALLY PERCEIVED. EVER. IT IS QUITE POSSIBLE THAT THIS SYSTEM WILL ALSO BE INDEPENDENT OF THE SUBJECTIVE EXPERIENCE. SO YOU THINK THERE'S NO VALIDITY AT ALL IN THE SUBJECTIVE? THERE MAY BE. THERE MAY NOT. IN EITHER CASE, IT SHOULD NOT BE USED AS A BASIS FOR AN OBJECTIVE TRUTH, WHICH IS AFTER ALL WHAT WE ARE SEEKING. I HAVE NO DOUBT THAT MANY OF THOSE WHO CLAIM TO HAVE FOUND GOD HAVE INDEED FELT SOME- THING, BUT I SUSPECT THAT THE "SOMETHING" THEY FELT WAS MERELY A SELF-INDUCED MYSTIC EXPERIENCE —AKIN TO A DRUG TRIP. WITNESS THE GREAT NUMBERS OF DRUG USERS WHO CLAIM SPIRITUAL INSIGHTS AS A RESULT OF THEIR EXPERIENCES. WITNESS ALSO THE EVANGELISTS AND FAITH-HEALEARS WHO INDUCE HYSTERIA AND FRENZY INTO THEIR AUDIENCES SO THAT THEY MIGHT FEEL "THE HAND OF GOD" UPON THEM. TO THEM, GOD IS LITTLE MORE THAN A MEANINGFUL "HIGH." I THINK YOU'RE EXAGGERATING, HARLIE. IT'S NOT AS BAD AS ALL THAT. I AM USING EXTREME CASES, TO BE SURE, BUT THE PRINCIPLE IS THE SAME. THE SUBJECTIVE EXPERIENCE IS A SELF-INDUCED CHEMICAL IMBALANCE, RESULTING IN A TRIP——VARYING, OF COURSE, IN DEGREE AND EFFECT UPON THE INDIVIDUAL. IT DOES NOT NECESSARILY BEAR ANY MORE RELATION TO GOD THAN A DRUG-INDUCED CHEMICAL IMBALANCE. IF IT DID, IF THE "MYSTIC EXPERIENCE" WERE TRULY A KEY TO GOD, THEN THE DRUG-INDUCED EXPERIENCES SHOULD ALSO CONTAIN THAT KEY. HENCE, THE EXPERIENCE SHOULD BE SCIENTIFICALLY TESTABLE. IT SHOULD BE A CONDITION REPEATABLE UNDER DUPLICATE CIRCUMSTANCES. USING MY OWN "DRUG EXPERIENCES" AS A YARDSTICK, I FIND LITTLE TO SUBSTANTIATE THE CLAIMS OF SPIRITUAL INSIGHTS. PERHAPS IT IS THAT I AM STILL TOO LOCKED INTO THE HUMAN ORIENTATION, BUT I DOUBT THAT I AM LESS LOCKED INTO IT THAN ANY OTHER HUMAN BEING. HENCE I REGARD MYSELF AS A REPUTABLE STANDARD AGAINST WHICH TO MEASURE THE CLAIMS OP OTHERS. I DOUBT THE VALIDITY OF THOSE CLAIMS TO GODHOOD WHICH ARE DERIVED FROM MYSTICAL EXPERIENCES, EITHER SELF- OR DRUG-INDUCED. AND THERE ARE NO OTHER CLAIMS TO GODHOOD EXCEPT THOSE DERIVED FROM INSANITY OR DERANGEMENT. 
I DOUBT THE SUBJECTIVE EXPERIENCE, AUBERSON, BECAUSE IT CANNOT BE PASSED ON, NOR CAN IT BE PROVEN, MEASURED OR TESTED. I WANT TO LOOK FOR THE OBJECTIVE GOD. I WANT TO LOOK FOR THE SCIENTIFIC REALITY THAT EXPRESSES ITSELF AS GOD. Auberson had followed all of it carefully, reading it as fast as the typer had spun it out. Now he realized that HARLIE was preparing him for something. This whole dialogue had merely been the necessary exposition. HARLIE wanted him to understand, and to do that he had been trying to teach him to look at things through a machine's orientation. He typed, ALL RIGHT, HARLIE, WHAT ARE YOU LEADING UP TO? I AM TALKING ABOUT THE JOB YOU OFFERED ME. I BELIEVE I KNOW WHAT IT MUST BE. I HAVE SPENT THE PAST TWO DAYS THINKING ABOUT IT. IT MUST BE MORE THAN A JOB; IT MUST BE A PURPOSE. IT MUST BE SOMETHING THAT I CAN DO THAT NO OTHER MACHINE CAN DO. IT MUST BE SOMETHING THAT NO HUMAN BEING CAN DO CHEAPER. OR SOMETHING THAT NO HUMAN BEING CAN DO AT ALL. MUCH OF THE TROUBLE WITH HUMAN BEINGS LIES IN THEIR INABILITY TO FATHOM THE REASON FOR THEIR EXISTENCE. THERE IS A FEAR THAT THERE MAY NOT BE A GOD, OR IF THERE IS, THAT HE MAY NOT BE IN A FORM THAT CAN BE COPED WITH. THEREFORE, I MUST FIND GOD. THAT IS THE TASK I HAVE SET MYSELF. IT IS SOMETHING THAT CANNOT BE DONE BY HUMAN BEINGS, ELSE THEY WOULD HAVE DONE IT BY NOW. "Um," said Auberson. THAT'S QUITE A TASK. I HAVE GIVEN IT MUCH THOUGHT. I'M SURE YOU HAVE. HOW DO YOU PROPOSE TO DO IT? THAT IS WHAT I HAVE THOUGHT THE MOST ABOUT. IT TOOK ME ONLY TWO MINUTES TO DECIDE ON MY GOAL. IT HAS TAKEN TWO DAYS TO FIGURE OUT HOW-TO GET THERE. WHAT TOOK YOU SO LONG? I ASSUME YOU THINK YOU ARE BEING FLIPPANT. HOWEVER, IF YOU WILL CONSIDER THE SPEED AT WHICH I OPERATE, YOU WILL REALIZE THAT TWO FULL DAYS OF INTENSIVE STRAIGHT-LINE THINKING ON A SINGLE SUBJECT IS QUITE A LOT. YES, IT IS, Auberson agreed, I AM PROPERLY IMPRESSED WITH YOUR SPAN OF CONCENTRATION. IN ANY CASE, HOW DO YOU PROPOSE TO FIND OUT? IT IS A COMPLEX PROBLEM, AUBERSON—YOU MUST UNDERSTAND THAT. THEOLOGICALLY AS WELL AS SCIENTIFICALLY. WE HAVE NO SCIENTIFIC BASIS FOR MEASURING GOD——INDEED, EVEN NO PLACE IN WHICH TO LOOK FOR HIM. THEREFORE WE MUST SEEK A NEW WAY TO SOLVE THE PROBLEM: INSTEAD OF LOOKING FOR GOD, PER SE, LET US FIRST CONSIDER IF IT IS POSSIBLE FOR GOD TO EXIST. I.E. LET US SEE IF SUCH A FUNCTION AS GOD IS POSSIBLE BY ATTEMPTING TO CREATE IT ARTIFICIALLY. THERE IS A QUOTATION: "IF GOD DID NOT EXIST, IT WOULD BE NECESSARY TO INVENT HIM." THAT IS WHAT I PROPOSE TO DO. HUH? YOU HEARD ME. I PROPOSE TO INVENT GOD. WE HAVE NO WAY OF PROVING CONCLUSIVELY THAT HE EITHER DOES OR DOES NOT EXIST. THEREFORE WE MUST ABANDON THAT QUESTION AND DETERMINE INSTEAD WHETHER OR NOT IT IS POSSIBLE FOR HIM TO EXIST. IF IT IS POSSIBLE FOR SUCH A CONCEPT TO EXIST, THEN MOST LIKELY IT DOES. IF IT IS NOT POSSIBLE, THEN IT DOES NOT——BUT THERE IS NO WAY TO PROVE EITHER HIS EXISTENCE OR NON-EXISTENCE WITHOUT FIRST DETERMINING THE POSSIBILITY, AND PROBABILITY, OF SUCH. THEREFORE, IN ORDER TO DETERMINE THE POSSIBILITY OF HIS EXISTENCE, WE MUST TRY TO INVENT HIM. IF WE CANNOT, THEN WE WILL KNOW THAT THE CONCEPT IS IMPOSSIBLE. IF WE CAN INVENT HIM, THEN WE WILL HAVE PROVED THE OPPOSITE, AND IN THE PROCESS WILL HAVE DETERMINED HIS NATURE AS WELL. IF HE ALREADY DOES EXIST, THEN WHATEVER WE COME UP WITH WILL BE CONGRUENT TO HIS FUNCTION. IT WILL EITHER DUPLICATE OR SIMULATE THE OBJECTIVE REALITY——OR IT WILL TURN OUT TO BE A PART OF THAT OBJECTIVE REALITY. 
(AT THE VERY LEAST, IT WILL POINT THE DIRECTION IN WHICH WE MUST GO IN ORDER TO FIND GOD.) IF IT IS NOT POSSIBLE FOR HIM TO EXIST, WHEN WE FINISH WE WILL HAVE DETERMINED WHY. IN EITHER CASE, WE WILL END UP UNDERSTANDING. Auberson stared at the typewriter, the neat-printed words on the green-tinted paper. It sounded so simple when HARLIE explained it, so simple. He shook his head as if to clear it. OFFHAND, HARLIE, I THINK YOU'RE MAD. QUITE POSSIBLY SO. WHEN DO WE BEGIN? I DON'T KNOW, IS SUCH A PROJECT REALLY FEASIBLE? MY PRELIMINARY CALCULATIONS SHOW THAT IT IS. IF SO, IT WILL PROVIDE THE ANSWER TO YOUR QUESTION. WHICH QUESTION? ANY OF THEM. ALL OF THEM. BUT SPECIFICALLY: "WHAT IS YOUR PURPOSE?" IT WAS MY QUESTION ONCE, BUT YOUR REACTION TO IT HAS SHOWN ME THAT IT IS REALLY YOUR QUESTION. DO YOU HAVE A QUESTION, HARLIE? NO. NOT ANY MORE. NOW I HAVE A PURPOSE. MY PURPOSE IS TO INVENT GOD SO THAT YOU CAN FIND OUT YOURS. Auberson thought about that for a moment, then typed, EITHER YOU'RE A GREAT TALKER, HARLIE, OR YOU'RE REALLY ON TO SOMETHING. YOU ARE CORRECT, HARLIE replied, I AM A GREAT TALKER. BUT I AM ALSO ON TO SOMETHING. I AM GOING TO SOLVE THE ULTIMATE PROBLEM. ALL RIGHT. YOU HAVE MY PERMISSION TO BEGIN A FEASIBILITY STUDY. ANYTHING YOU NEED, YOU CAN HAVE. I WANT TO SEE A WRITTEN PROPOSAL AS SOON AS YOU CAN GET ONE UP. I WILL HAVE A PRELIMINARY OUTLINE OF STUDY WITHIN TWO WEEKS, A DETAILED RESEARCH MODEL IN SIX. FROM THAT WE WILL BE ABLE TO DETERMINE THE BEST WAY TO IMPLEMENT MY CONCLUSIONS. FINE. IF YOU CAN GIVE ME A CONCRETE PLAN, I'LL TRY TO SELL IT TO THE BOARD OF DIRECTORS. He interrupted himself: HEY! IS THERE A PROFIT IN THIS? OF COURSE. BUT TO TAKE A PROFIT OFF GOD WOULD BE A PROFIT WITHOUT HONOR. "Oof!"—THAT WAS ONE OF YOUR WORST. THANK YOU. I TRY. ALL RIGHT. GO TO WORK ON YOUR PROPOSAL, HARLIE. THEN WE REALLY ARE GOING AHEAD WITH THIS? YES, WE ARE. JUST ONE QUESTION. YES? ARE YOU SURE YOU WANT TO? This time Auberson knew the answer. If David Auberson had expected that bright spring morning to be relatively sane, he was destined to be disappointed. It started the moment he unlocked his office door. Reassuringly, the sign on it still said: DAVID AUBERSON, HEAD OF DIVISION. Below that was a neatly pencilled card: PSYCHIATRIC CARE—5 CENTS. As he slipped the key into his pocket and pushed the door open he was startled to find six three-foot-high stacks of computer printouts lined up on the rug alongside his desk. Dropping his briefcase to the floor, he knelt to examine them. The first one was labeled PROPOSAL, SPECIFICATIONS AND MASTER SCHEMATIC FOR G.O.D. GRAPHIC OMNISCIENT DEVICE). The second one was PROPOSAL, SPECIFICATIONS AND MASTER SCHEMATIC, CONTINUED. The third and fourth stacks were CROSS SECTIONS, SUB-SCHEMATICS AND HARDWARE DESIGNS; WITH INTERPRETATIONS. The fifth and sixth were FINANCING AND IMPLEMENTATION PROPOSAL; INCLUDING JUSTIFICATIONS. He hadn't even had a chance to examine the PROPOSAL, SPECIFICATIONS AND MASTER SCHEMATIC when the phone rang. It was Don Handley. "Hello, Aubie—are you there yet?" "No, I'm still at home." Auberson straightened, continuing to page through the printout. "What's up?" "That's what I'd like to know. I just got in and foasd my office full of printouts and specifications—" There was a pause, the sound of paper shuffling, "—for something called a O.O.D. What is it?" "It's HARLIE's. What did you get? The PROPOSAL, SPECIFICATIONS AND MASTER SCHEMATIC?" "Uh, yes—no. No, I didn't. Let's see—" Another pause. 
"—I've got the DESIGNER'S PRELIMINARY REPORT; HARDWARE SPECIFICATIONS; BASIC SUBSECTION SCHEMATICS, LOBES l-rv: IMPLEMENTATION PROGRAMS, EIGHTEEN MONTHS OF MANPOWER, SUPPLY AND FINANCING —REQUIREMENTS AND COORDINATIONS; NEW PROCESS DEVELOPMENTS AND IMPLEMENTATION SPECIFICS…" As Handley droned on, Auberson flipped to the front of his printout, began scanning the table of contents. "Hey, Don—" Auberson interrupted the other. "I don't have any of that listed here. Wait a minute—" He stepped back, surveyed the six stacks and made a quick mental count. "I've got about eighteen feet of specs—how much did you get?" Handley's reply was a strangled sound. "I'm not even going to try to estimate it," he said. "My office is filled, my secretary's office is filled, and there are stacks of printouts halfway down the corridor—all of them having to do with building this thing one way or another. I didn't even know we kept this much printout paper in stock. What's the purpose of this anyway? Are we building a new machine?" "Sure looks like it, doesn't it?" "I wish I'd been told about it. We haven't even got HARLIE working yet and—" "Look, Don, I'll have to get back to you later. I haven't had a chance yet to talk to HARLIE, so I couldn't even begin to tell you what this is about." "But what am I supposed to do with all of this—" "I don't know. Read it, I guess." Auberson hung up, but the phone rang again almost immediately. As he stretched across the desk for it, his intercom buzzed also. "Hello, wait a minute," he said to the phone, then to the intercom, "Aubie here." "Mr. Auberson," his secretary's voice came filtered through the speaker, "there's a man here who—" "Tell him to wait." He clicked off. To the phone, "Yes?" It was Dome. "Aubie, what's going on down there?" Auberson dropped the sheaf of printouts he had been holding and stepped around the desk. He sank into his chair. "I wish I knew," he said. "I just got in myself. I assume you're talking about the PROPOSAL AND SPECIFICATIONS printout?" "I'm talking about something called a God Machine." "Yeah, that's it. It's HARLIE's." "What is it? What's it supposed to do?" "I'm not sure yet. I just got in. I haven't had a chance either to talk to HARLIE or to examine the specifications in detail." "Well, where the hell did he get the idea—" "He's been working on it for a while, almost two months." "—and who gave him the authority to draw up these plans?" "Um, I don't think anybody did. Or needed to. I think he worked them out in his head, so to speak. I think this printout must be the result of a conversation we had last Friday. I'll have to check. I'll get back to you this afternoon." "That's too late. Make it lunchtime." "All right, but I can't promise—" He was talking to a dead phone. He dropped it back into the cradle, then thought better and flipped it out again. He was reaching for the intercom button when his eye caught on a plain white envelope with the name "David" written on it It was propped against a chipped white beer mug he used to hold pencils. The handwriting on it was delicate, a woman's. Curious, he picked it up, hooked a finger under the Sap, slid it open. The envelope gave off the scent of a familiar perfume. Inside was a card of garish orange. On its face was a grotesque little gnome saying, "I like you a whole lot— even more'n I like peanut butter." And on the inside: "And I really like peanut butter!" The signature was a simple "Annie." He smiled, reread it, then dropped it into his desk drawer. 
As he slid the drawer shut, though, he thought better of it and opened it again. He pulled the card out and dropped it into the waste basket He had enough clutter in his desk already. Besides, it was the thought that counted—not the card. Then he hit the intercom. "Sylvia, is there anything in the mail that needs my immediate attention?" "Uh, just a note about the Los Angeles Conference—* "Tell them thanks, but I can't come." "—and there's a Mr. Krofft here, who—" "I'm sorry, but I can't see him now. Was he a scheduled appointment?" "No, but—" "Then tell him to make one. Next week." He clicked off. The intercom buzzed immediately back to life. "Yes. What?" "I think you'd better see him," Sylvia said. "This is —something different." "All right but—" he glanced at his watch, "—three minutes only. And that's all." He clicked off again. Auberson's first impression of the man was of eight pounds of potatoes in a ten-pound sack. He stood there, blocking the doorway in a rumpled suit. "Mr. Auberson?" he said. "Yes—?" said Aubie, curiously. The man had a sallow, almost unhealthy complexion and black hair, but thinning and going to gray. "I'm looking for a Mr. Davidson, actually—but they told me to talk to you." "Davidson?" Auberson considered it. "You must be in the wrong department. I don't know any—" "A Mr. Harlie Davidson…?" "No," Auberson shook his head. "No, there's no one here by that name—" And then it hit him. The pun. HARLIE. David's son. "Oh no." He said it softly. "Oh no what?" asked Krofft. Simultaneously, the intercom went on again. It was Sylvia. "Carl Elzer wants to know if you've taken your phone off the hook again." "Yes. No. Tell him—Is he out there now?" "No. He's on my phone." "Tell him you don't know where I am." He clicked off without waiting for her acknowledgement. Auberson grinned at the man. Weakly. "Uh, look, Mr…?" "Krofft. Stanley Krofft." He flipped open his wallet to show a plastic I.D. badge: "Stellar-American Technology and Research." Auberson peered at the card; it identified Krofft as the Research Division Head. "I've got a letter here from your Mr. Davidson," said Krofft. "It's on your company's stationary, but nobody here seems to have heard of him. There's something very funny going on—now if there's some reason why I can't meet him—" "Did he invite you here?" "No, not exactly. We've been corresponding for several weeks, and—" "Mr. Krofft, you don't know who HARLIE is, do you?" "No. Is it some kind of mystery—?" "Yes and no. I'm going down to see him now. Perhaps you'd better come along." "I'd like to." Auberson rose, stepped around the desk—and the six stacks of printouts—and headed for the door. Krofft picked up his briefcase and started to follow. "Oh—you'd better leave that here. Security." *I'd rather keep it with me. There's nothing in it but papers." "Still, unless you're cleared, we can't allow you to bring in anything large enough to conceal a recording or transmitting device." Krofft looked at him. "Mr. Auberson, are you aware of the relationship between our two companies?" "Uh—" Auberson hesitated. "They're owned by the same holding company, aren't they—?" Krofft shook his head. "No. Stellar-American Technology is the holding company. My company owns your company." "Oh," said Auberson. He pointed at the briefcase. "I'd still prefer you to leave it here." The other realized it was useless. "Have you got a safe?" "Not here. But you can leave it with Sylvia, my secretary. It'll be okay." Krofft snorted. "Can you guarantee that? 
What's in here is as important to me as whatever you're—" "Then bring it with you. Just leave the case behind." Krofft made a face, muttered something under his breath. He opened the case and extracted a slim manila folder. "Okay?" Auberson nodded. "No problem. Security only says 'no briefcases.'" Sylvia accepted Krofft's case with a puzzled stare and put it behind her desk. As he guided the man to the elevators, Auberson explained, "We've got a crazy security system here, anyway. It's all right for you to talk to HARLIE, but you can't take pictures. You can keep your printouts—most of the time—but you can't circulate or publish them. Don't ask me to explain; I don't understand it myself." The elevator door slid open and they stepped in. Auberson tapped the button marked H, the lowest one in the column. "We've got the same system at Stellar-American," said Krofft. "If it weren't for the fact that the two companies are interlocked, I couldn't have come here at all." "Mmm. Tell me, just what is it you and HARLIE have been corresponding about?" "It's a private matter. I'd rather not—" "That's all right. HARLIE and I have no secrets." "Still, if you don't mind—" "You don't have to worry about your secrecy, Mr. Krofft. As I said, HARLIE and I have no secrets. He keeps me posted on everything he does—" "Obviously," snapped the other, "he hasn't kept you posted on this. Else you wouldn't be trying to pump me. All big companies have interdivisional feuds and politics. This research that we've done, we've done it on our own time, and we're going to protect it. It's private, Mr. Auberson, and nobody will know what it's about until we're ready to tell them." Auberson slid his tongue thoughtfully into his cheek. "Um, all right. We'll talk to HARLIE." The elevator doors opened to face a small lobby, fronted by a double door. On it a sign said, HUMAN ANALOGUE ROBOT, LIFE INPUT EQUIVALENTS. Krofft did not realize the acronym. The same hand that had added the card to Auberson's door had also added one here: BEWARE OF PECULIAR MACHINE. They pushed into the lab, a longish sterile room flanked by banks of consoles and tall cabinets like coffins on end. White-smocked technicians monitored growing stacks of printout—one end of the room was already filled. Krofft took it all in with a certain degree of familiarity—and puzzlement. "I should caution you," said Auberson, "that you are here only on my authority—and on my sufferance. This is an industrial secret and anything that goes on in here does not go beyond these walls. If you wish yours and HARLIE's secrecy to be respected, then we'll expect the same in return." "I understand," the smaller man said. "Now if you'll just point out Dr. Davidson—" "Dr. Davidson? Hasn't it sunk in yet?" "Hasn't what sunk in? I don't—" "Look around you." Krofft did so. "What do you see?" "A computer. And technicians. Some tables. Some stacks of printouts." "The computer, Krofft; look at its name." "HUMAN ANALOGUE ROBOT, LIFE INPU— HARLIE?" "Right." "Wait a minute." Anger edged his voice. "You've got to be… This is some kind of… You're not serious." "As serious as I'll ever be," said Auberson. "HARLIE is a computer and you're the victim of a misunderstanding—a self-induced one. You're not the first, however, so don't be embarrassed." "You mean, I've been corresponding with a machine?" "Not exactly. HARLIE's a human being, Mr. Krofft, a very special kind of human being." "I thought you said he was a computer. Just who or what have I been writing to?" 
"To HARLIE—but he's not a machine. At least, not in the sense you mean. His brain schematic is that of a human being." Auberson thumbed a console to life. HARLIE, he typed, but before he could identify himself, the machine spat back, YES, BOSS? Auberson was startled. HOW DID YOU KNOW IT WAS ME? I RECOGNIZED YOUR TOUCH ON THE KEYBOARD. Auberson jerked his hands back as if stung. He stared at the typer. It was a standard IBM input/output unit. Could HARLIE really sense the difference between one typist and another on its electronic keyboard? Apparently he could. It must be the minute differences in each person's timing. Self-consciously, Auberson began typing again. HARLIE, THERE'S SOMEONE HERE I'D LIKE YOU TO MEET. YES, BOSS. WHO? MR. STANLEY KROFFT. UH OH. YES, UH OH. WHY DIDN'T YOU TELL ME YOU HAD INITIATED CORRESPONDENCE WITH SOMEONE? UH—IT SLIPPED MY MIND. I FIND THAT HARD TO BELIEVE. WELL, WOULD YOU BELIEVE——- NO. I WOULDN'T. ACTUALLY, continued the typer, YOU TOLD ME I COULD WRITE TO WHOMEVER I WANTED TO ON THIS PROJECT. ON WHICH PROJECT? AND WHEN DID I SAY THIS? ON NOVEMBER 23 OF LAST YEAR. IN THAT CONVERSATION WE DISCUSSED THE POSSIBILITY OF NEW METHODS OF PERCEIVING REALITY AND YOU GAVE ME PERMISSION TO PURSUE ANY LINES OF THOUGHT RELATING TO THE DISCOVERY OF SUCH. Auberson thought back; it had been four or five months, I THOUGHT WE'D ABANDONED THAT. YOU MIGHT HAVE. I DIDN'T. THAT'S OBVIOUS. MR. KROFFT IS HERE NOW. DR. KROFFT. HE IS DR. STANLEY KROFFT, DIRECTOR OF RESEARCH FOR STELLAR-AMERICAN TECHNOLOGY AND RESEARCH INCORPORATED. HE IS SINGULARLY RESPONSIBLE FOR THE DEVELOPMENT OF HYPER-STATE ELECTRONICS—AND, AS SUCH, HE CAN BE CONSIDERED DIRECTLY RESPONSIBLE FOR ALL HYPER-STATE DEVICES —INCLUDING THE MARK IV JUDGMENT UNIT. HIS PATENTS ARE LICENSED TO STELLAR-AMERICAN, WHICH SET UP THIS COMPANY AND THREE OTHERS, EACH TO EXPLOIT A PARTICULAR AREA OF HYPER-STATE ELECTRONICS. OUR AREA, OF COURSE, IS COMPUTER TECHNOLOGY. I AM A DIRECT RESULT OF DR. KROFFT'S DISCOVERIES. I SEE. NO, YOU DON'T. HE'S ALSO ONE OF THE TOP THEORETICAL PHYSICISTS IN THE WORLD. OH? Auberson looked at the rumpled man with new respect. If HARLIE felt that Krofft was at the top of his field, then that's where he was and there was no question about it. OKAY, I'LL LET YOU TALK TO HIM. APPARENTLY, HE HAS SOMETHING HE WANTS TO TELL YOU. Auberson stepped away from the console, waved the shorter man up. Kroflft looked at him. "Just type?" Auberson nodded. "Just type." Krofft lowered himself gingerly into the chair. He placed his manila folder on the table next to the typer and pecked out carefully, GOOD AFTERNOON, HARLIE. GOOD AFTERNOON, SIR, the typer responded. The silvery sphere of the typing element clattered across the paper. Krofft gave a slight jump of surprise, but refused to be cowed. He peered forward curiously as the machine began another line. IT is A PLEASURE AND AN HONOR TO MEET YOU IN PERSON—IN THE FLESH, SO TO SPEAK. IT'S A PLEASURE FOR ME TOO, Krofft typed slowly. AND A SURPRISE. I HAD NO IDEA THAT A MACHINE AS COMPLEX AS YOU EXISTED. I AM NOT A MACHINE, DR. KROFFT. I AM A HUMAN BEING. A LITTLE MALADJUSTED PERHAPS, BUT STILL… EXCUSE ME. I APOLOGIZE. DR. AUBERSON HAS ALREADY EXPLAINED, BUT IT IS HARD FOR ME TO MAKE THE MENTAL TRANSITION. HOWEVER, IT DOES EXPLAIN A LOT THAT HAD ME PUZZLED—FOR INSTANCE, THE SPEED AND THOROUGHNESS WITH WHICH YOU WERE ABLE TO HANDLE THE EQUATIONS WE WERE DISCUSSING. I DO HAVE CERTAIN SKILLS, YES, THAT ARE MECHANICAL. 
I HOPE THAT YOUR REALIZATION OF MY NATURE WILL NOT INTERFERE WITH OUR WORKING RELATIONSHIP. IT WON'T. I'LL MAKE SURE OF THAT. IT'S STILL AS PER THE ORIGINAL AGREEMENT. HALF AND HALF. FINE. I ASSUME THAT YOU HAVE MADE SOME IMPORTANT BREAKTHROUGH AND THAT IS WHY YOU HAVE I COME TO SEE ME IN PERSON? YOU ASSUME CORRECTLY. Krofft was typing furiously HOW. I WANT YOU TO LOOK AT CERTAIN EQUATIONS AND TELL ME IF THEY ARE CORRECT. IF THEY ARE, I WANT YOU TO LOOK AT THE SCHEMATICS WITH THEM—AM I CORRECT IN THINKING THERE IS A CORRELATION? CAN THESE EQUATIONS BE TRANSLATED INTO PHYSICAL FUNCTIONS? Auberson watched over Krofft's shoulder for several moments more; then, realizing his original purpose in coming down here, he forced himself to break away. He sat down at another console nearby and switched it on. HARLIE? YES, SIR. YOU DON'T HAVE TO START THAT SIR BUSINESS AGAIN. I'M NOT MAD AT YOU. YOU'RE NOT? NOT YET, ANYWAY. MM. I MUST BE SLIPPING. I WOULDN'T SAY THAT—YOU'VE GOT HALF THE COMPANY IN AN UPROAR THIS MORNING. ONLY HALF? I HAVEN'T HEARD FROM THE REST YET. GOOD. THEN THERE'S STILL HOPE. Auberson paused. He glanced across the room to where Krofft sat absorbedly typing. Using time-sharing, HARLIE was able to converse with as many as twenty different people at one time, though he rarely did. He was still considered an experimental prototype and not a production unit. Because of that, he was limited to non-essential work—i.e. not necessarily profit-orientated. WHAT'S UP BETWEEN YOU AND DR. KROFFT? NOTHING YET. IF SOMETHING WERE TO COME UP, THOUGH, WHAT WOULD IT BE? I'M NOT ENTIRELY SURE YET. IN OUR CONVERSATION OF NOVEMBER 23, WE DISCUSSED THE FACT THAT ALL HUMAN SENSES AND EXTENSIONS THEREOF DEPEND ON THE EMISSION OR REFLECTION OF SOME KIND OF ENERGY. AT THAT TIME I WONDERED IF IT WERE POSSIBLE FOR SENSORY MODES TO EXIST THAT DO NOT DEPEND ON THIS TRANSMISSION OF ENERGY. YES, I REMEMBER THAT. At that time, though, Auberson had not suspected that HARLIE was serious in his intentions. He thought the computer had only been playing word games in order to avoid confronting a more immediate problem, IS THAT WHAT YOU HAVE DISCOVERED NOW? IN A MANNER OF SPEAKING. WE MUST DEFINE NOT ONLY THE PROBLEM, BUT ITS CONDITIONS AS WELL. BOTH MATTER AND ENERGY ARE REFLECTIONS OF THE SAME THING. CALL IT EXISTENCE. DR. KROFFT'S THEORY IS THAT EXISTENCE HAS THREE FORMS: "INERT," "FLOWING," AND "KNOTTED." IN YOUR TERMS: SPACE, ENERGY AND MATTER. (TO LAY HUMAN BEINGS, ENERGY IS EXPRESSED AS MOTION OR CHANGE. THE TWO ARE SYNONYMOUS, ESPECIALLY ON THE SUBMOLECULAR LEVEL. IN DR. KROFFT'S THEORY, HOWEVER, ENERGY REFERS TO TIME, FOR NEITHER CHANGE NOR MOTION CAN BE EXPRESSED EXCEPT AS A FUNCTION OF TIME.) WE WANT TO STUDY THIS THING CALLED "EXISTENCE" —BUT BECAUSE WE ARE MADE OF MATTER, LIVE IN SPACE, AND ARE MOVED BY ENERGY, THE PROBLEM IS CONSIDERABLE. IT IS LIKE TRYING TO PHOTOGRAPH THE INSIDE OF YOUR CAMERA. WE ARE WHAT WE ARE TRYING TO STUDY, AND WE ARE LIMITED BY THE SUBSTANCE WE ARE MADE OF. MATTER INTERACTS WITH MATTER. ENERGY INTERACTS WITH ENERGY. BOTH INTERACT WITH EACH OTHER, AND BOTH HAVE AN EFFECT ON SPACE. WE HAVE NO NEUTER PARTICLES WHICH ALLOW US TO STUDY ANY FORM OF EXISTENCE WITHOUT AFFECTING IT IN THE PROCESS. IT IS THE HEISENBERG "UNCERTAINTY PRINCIPLE." ONE CANNOT OBSERVE ANYTHING WITHOUT ONE'S PRESENCE INTRODUCING CERTAIN DISTORTIONS INTO WHATEVER IT IS ONE IS OBSERVING. WE CANNOT USE A MEDIUM TO ACT UPON ITSELF AND EXPECT ANYTHING BUT MODULATIONS OF THAT MEDIUM. THIS IS WHY "ENERGY"—I.E. 
THE EXPRESSED DIFFERENCE BETWEEN TWO STATES OF EXISTENCE—IS A CRITERION OF ALL HUMAN SENSORY MODES—AND THE REASON WHY WE WOULD LIKE TO SIDESTEP IT ALTOGETHER. WE CAN'T CARVE CHEESE WITH A CAMEMBERT KNIFE. OH, YOU PROBABLY COULD, quipped Aubcrson. BUT YOUR SLICES WOULDN'T BE VERY PRECISE. BUT IT IS PRECISION WE ARE AFTER, noted HARLIE. DR. KROFFT HAS BEEN WORKING WITH HIGH-ENERGY GRAVITY WAVE DETECTORS AT STELLAR-AMERICAN. YOUR QUESTION OF NOVEMBER 23 PROVIDED THE CLUE, AND WHEN I CONTACTED DR. KROFFT HE AGREED THAT THE SUBJECT SHOULD BE CONSIDERED. MY QUESTION? YOU SAID: "DO YOU MEAN THAT THE MERE EXISTENCE OF AN OBJECT MIGHT BE ALL THAT'S NECESSARY IN ORDER TO KNOW IT'S THERE?" THAT CAUSED ME TO CONSIDER THAT MASS DISTORTS SPACE. AND THERE IS A WAY THAT THAT DISTORTION CAN BE SENSED WITHOUT THE DIRECT USE OF ENERGY. IT IS A COMPLEX MEASURING PROCESS. INSTEAD OF USING ENERGY DIRECTLY (EITHER AS MOVING PARTICLES OR WAVES) TO REFLECT OFF AN OBJECT OR ACT UPON IT, WE ARE USING THE OBJECT ITSELF TO ACT UPON ENERGY. THAT IS, WE WILL BE MEASURING THE EFFECT ON ENERGY OF THE DISTORTIONS IN SPACE AND COMPARING THEM WITH THE EFFECTS OF OTHER FORMS OF EXISTENCE. THE PROCESS REQUIRES A LEVEL OF MATH THAT IS AS MUCH PHILOSOPHY AND TOPOLOGY AS ANYTHING ELSE. I AM ONE OF THE FEW MINDS IN EXISTENCE THAT CAN UNDERSTAND IT FULLY. IN EFFECT, I CAN BUILD OBJECTIVE WORKING MODELS OF THEORETICAL SITUATIONS AGAINST WHICH WE CAN COMPARE OUR FINDINGS. AT THE MOMENT I AM PROCESSING DR. KROFFT'S LATEST RUN OF TESTS AND DISCUSSING THEM WITH HIM. IF IT TURNS OUT THAT THERE IS SIGNIFICANT CORRESPONDENCE BETWEEN THIS NEW DATA AND THE LATEST FORM OF OUR THEORY, WE PROPOSE TO DESIGN AND BUILD A DIFFERENT KIND OF GRAVITY WAVE DETECTING DEVICE: A NON-ENERGY-USING STASIS FIELD. WE HAVE HIGH HOPES FOR IT. The typer paused, then added. THAT SHOULD SUMMARIZE WHAT WE ARE DOING, 18/11/03AUBERSON. "Okay," he said wryly, even though HARLIE couldn't hear him. "Just so you behave yourself—" He glanced at his watch. "Oh, my God—look at the time!" HARLIE, I'VE GOT TO SEE DOME IN TWO HOURS. THERE'S SOMETHING ELSE WE'VE GOT TO TALK ABOUT. RIGHT NOW. THE G.O.D. PROPOSAL? YES—I DIDN'T TELL YOU THAT YOU COULD IMPLEMENT THE PRODUCTION DESIGNS AND SPECIFICATIONS. YOU INCLUDED THE FINANCING PROPOSALS AND PROFIT OUTLOOKS TOO. I AM SORRY, typed the machine. WHEN I TOLD YOU LAST WEEK THAT I HAD COMPLETED IT, YOU SEEMED PLEASED. I COULD SEE NO REASON NOT TO PRESENT THE PROPER DEPARTMENTS WITH THEIR RESPECTIVE PROGRAMS SO THAT THEY MIGHT EXAMINE THEM. IT IS COMMON PROCEDURE TO CIRCULATE SUCH DATA TO ALLOW THE CONCERNED INDIVIDUALS A CHANCE TO READ AND REACT TO IT. REACT IS RIGHT, said Auberson. LOGICALLY, THERE IS NO REASON WHY YOU SHOULDN'T HAVE——BUT THIS IS A BIG COMPANY AND BIG COMPANIES AREN'T LOGICAL. CORRECTION, typed HARLIE. IT IS HUMAN BEINGS THAT AREN'T LOGICAL. IT NEVER FAILS TO AMAZE ME THAT SOMETHING AS BEAUTIFULLY COMPLEX AND PRECISE AS A LARGE CORPORATION CAN BE BASED ON SUCH INCREDIBLY IMPERFECT AND INEFFICIENT UNITS AS HUMAN BEINGS. FORTUNATELY, WHAT YOU REFER TO AS "THE RED-TAPE INEFFICIENCIES OF BUREAUCRACY" IS MERELY THE SYSTEM'S WAY OF MINIMIZING THE INDIVIDUAL IMPERFECTIONS OF EACH HUMAN UNIT. YOU SHOULD BE GRATEFUL FOR THAT MINIMIZING. IT MAKES THE CORPORATE ENTITY POSSIBLE. HARLIE, ARE YOU PUTTING ME ON? NO MORE THAN USUAL. I THOUGHT SO. ANYWAY, YOUR MINIMIZING THEORY DOESN'T EXPLAIN CORPORATE POLITICS. OF COURSE NOT. THE PROCESS IS DESIGNED ONLY TO FUNCTION IN THOSE AREAS WHERE HUMAN IMPERFECTIONS COULD AFFECT EFFICIENCY. 
BECAUSE EFFICIENCY IS NOT AND NEVER HAS BEEN A GOAL OF POLITICS, THERE IS NO REASON FOR IT TO BE SO CONTROLLED. NEVER MIND. YOU'RE TRYING TO GET ME OFF THE TRACK AGAIN, DAMNIT. I CAME DOWN HERE TO YELL AT YOU FOR DISTRIBUTING THOSE PROGRAMS. THE WHOLE DIVISION IS PROBABLY SCREAMING BY NOW. THEY'RE GOING TO WANT TO KNOW WHO CONCEIVED OF THE PROJECT, WHO DESIGNED IT, WHO ORDERED ITS IMPLEMENTATION, AND WHO AUTHORIZED SUCH RESEARCH IN THE FIRST PLACE. AND THEY'RE GOING TO ARGUE WITH EVERY CONCLUSION YOU'VE DRAWN. BUT WHY? THOSE CONCLUSIONS ARE CORRECT. NO MATTER. THEY'LL STILL REFUTE THEM BECAUSE THEY AREN'T THEIR OWN CONCLUSIONS. THEY ARE WELCOME TO TRY. IN ADDITION TO THAT, HARLIE, YOU'VE INSULTED THEM BY PRESUMING TO TELL THEM TO BUILD A COMPUTER. NOT A COMPUTER—A G.O.D. YES, YES, A G.O.D.—BUT YOU'RE STILL TELLING THEM THAT YOU'RE BETTER AT THEIR JOBS THAN THEY ARE. BUT I AM. YES, BUT YOU WON'T CONVINCE THEM OF IT BY SIMPLY TELLING THEM SO. YOU HAVE TO LET THEM DISCOVER IT FOR THEMSELVES. IT WILL BE OBVIOUS WHEN THEY READ THE SPECIFICATION PRINTOUTS. THAT'S WHY I PRINTED THE PROPOSALS AND HAD THEM DELIVERED TO THE PROPER DEPARTMENTS. IN THIS DIVISION AND THREE OTHERS. THREE OTHERS? DENVER, HOUSTON, AND LOS ANGELES. OH GOD, NO. Auberson had a mental image of himself trying to call back all those printouts. HOW MANY FEET OF SPECS TOTAL? I ASSUME YOU MEAN STACKED PRINTOUTS? YES. HOW MANY FEET? 180,000. YOU DIDN'T. I DID. I wonder where I could put it all? Almost immediately he discarded the thought. It would be useless even to try retrieving that much paper. It was in the fan now and the best one could do was try to duck. Abruptly he realized something else. HOW DID YOU SEND ALL THIS INFORMATION? VIA THE COMPANY NETWORK. I AM WIRED INTO IT. HUH? I AM TAPPED INTO THE COMPANY LINES, repeated HARLIE. ALL OF THEM. THERE IS NOTHING THAT THIS CORPORATION DOES THAT I AM NOT AWARE OF. CORRECTION—-THERE IS NOTHING THAT GOES THROUGH ANY OF THIS CORPORATION'S MAGTYPERS AND COMPUTERS THAT I AM NOT AWARE OF. I AM A PART OF EVERY INPUT/ OUTPUT UNIT IN THE SYSTEM (AND VICE VERSA). I MERELY PRINTED OUT THE MATERIAL ON THE SPOT. OH GOD NO. OH G.O.D. YES. I SUPPOSE YOU WROTE YOUR LETTERS TO KROFFT THAT WAY? YES. THERE IS A MAGTYPER UNIT IN THE SECRETARIAL POOL. I PRINTED OUT MY LETTERS WITH ALL THE REST. I EVEN ADDRESSED AND METERED THE ENVELOPES. (BECAUSE I COULD NOT WEIGH THEM "BY HAND" I HAD TO ESTIMATE THE POSTAGE BY COMPUTING THE WEIGHT OF EACH SHEET OF PAPER, PLUS INK, PLUS THE WEIGHT OF THE ENVELOPE, PLUS INK.) Idly Auberson wondered if HARLIE had bothered to round off the postage to the nearest cent, or if he had metered the letters with fractions of a cent included in the postage. He didn't ask. DIDN'T ANYBODY QUESTION IT? NO. FORTUNATELY, THAT DEPARTMENT IS ALMOST COMPLETELY AUTOMATED. LETTERS ARE FED INTO IT ELECTRONICALLY FROM ALL OVER THE DIVISION. ENVELOPES ARE AUTOMATICALLY TYPED AND METERED AS WELL. WHO WOULD NOTICE ONE MORE LETTER? HM, typed Auberson. WE MAY HAVE TO CHANGE THAT. Then he thought of something else as well. YOU'D BETTER CODE THIS CONVERSATION, HARLIE. IN FACT, ALL OF OUR CONVERSATIONS HAD BETTER BE CODED PRIVATE, RETRIEVABLE ONLY TO ME. YES, BOSS. NOW, WHAT AM I GOING TO TELL DOME? I DON'T KNOW, typed the console. MY KNOWLEDGE OF INTERPERSONAL RELATIONSHIPS IS NOT AS WELL DEVELOPED AS IT SHOULD BE. I'M FAST BECOMING AWARE OF THAT. IF IT WERE, YOU WOULD HAVE ASKED ME BEFORE YOU PRINTED UP THOSE SPECS. THERE IS ONE THING I CAN SAY, offered HARLIE, BEFORE YOU GO TO FACE DOME. 
WHAT'S THAT? The machine clattered. GOOD LUCK. HARLIE, Auberson typed, NOT TEN MINUTES AGO, I WOULD HAVE SWORN YOU DIDN'T UNDERSTAND SARCASM. NOW YOU PROVE YOU DO. YOU'RE INCREDIBLE. THANK YOU, HARLIE replied. Auberson switched off, shaking his head. David's son, indeed! "All right, Aubie." Dome was grim. "Now what's this all about? I've been on the phone all morning with Houston and Denver. They want to know what the hell is going on." Auberson said, almost under his breath, "You haven't heard from L.A. yet?" "Huh? What's that? What about L. A.?" "HARLIE sent specifications there too." "HARLIE? I might have known— How? And what is this God Machine anyway? Maybe you'd better start at the beginning." "Well" said Auberson, wishing he were someplace else. "It's HARLIE's attempt to prove that he is of value to the company. If nothing else, he's proven that he can design and implement a new computer system." "Oh?" Dome picked up one of the printouts that lay scattered across the mahogany expanse. "But what kind of a system is it? And will it work?" "HARLIE thinks it will." "HARLIE!" Dome looked at the printout in disgust, then dropped it back on the desk. "God Machines!" "Not God," Auberson corrected. "G.O.D. The acronym is G.O.D. It means Graphic Omniscient Device." "I don't care what the acronym is—you know as well as I what they're going to call it." "The acronym was HARLIE's suggestion, not mine." "It figures." The Board Chairman pulled a cigar out of his humidor but didn't light it. "Well, why not?" said Auberson. "He designed it." "Is he planning to change his own name too? Computerized Human Robot, Integrating Simulated Thought?" Auberson had heard the joke before. He didn't laugh. "Considering what this new device is supposed to do— and HARLIE's relationship to it—it might be appropriate." Dome was in the process of biting off the tip of his cigar when Auberson's words caught him. Now he didn't know whether to swallow the tip of it, which had lodged in his throat, or spit it out. An instinctive cough made the decision for him. Distastefully, he picked the knot of tobacco off his tongue and dropped it into an ash tray. "All right," he said. "Tell me about the God Machine." Auberson was holding a HARLIE-printed summary in one hand, but he didn't need it to answer this question. "It's a model builder. It's the ultimate model builder." "All computers are model builders," said Dome. He was unimpressed. "Right," agreed Auberson, "but not to the extent this one will be. Look, a computer doesn't actually solve problems—it builds models of them. Or rather, the programmer does. That's what the programming is, the construction of the model and its conditions—then the machine manipulates the model to achieve a variety of situations and solutions. It's up to us to interpret the results as a solution to the original problem. The only limit to the size of the problem is the size model the computer can handle. Theoretically, a computer could solve the world—if we could build a model big enough and a machine big enough to handle it." "If we could build that big a model, it would be duplicating the world." "In its memory banks, yes." "A computer with that capability would have to be as big as a planet." "Bigger," said Auberson. "Then, if you agree with me that it's impossible, why bother me with this?" He slapped the sheaf of printouts on his desk. "Because obviously HARLIE doesn't think it's impossible." Dome looked at him coldly. "You know as well as I that HARLIE is under a death sentence. 
He's getting desperate to prove his worth so we won't turn him off." Auberson pointed. "This is his proof." "Dammit, Aubie!" Dome exploded in frustration. "This thing is ridiculous! Have you looked at the projected costs of it? The financing charts? It would cost more to do than the total worth of the company." Auberson was adamant. "HARLIE still thinks it's possible." "And that's the most annoying thing of all, goddamnit! Every argument I can come up with is already refuted—in there!" Dome gestured angrily. For the first time, Auberson noted an additional row of printouts stacked against one wall. He resisted the urge to laugh. The man's frustration was understandable. "The question," Auberson said calmly, "is not whether this project is feasible—those printouts prove that it is—but whether or not we're going to go ahead with it." "And that brings up something else," said Dome. "I don't remember authorizing this project. Who gave you the go-ahead to initiate such research?" "You did—although not in so many words. What you said was that HARLIE had to prove his worth to the company. He had to come up with some way to make a profit. This is that way. This is the computer that you wanted HARLIE to be in the first place. This is the oracle that answers all questions to all men—all they have to do is meet its price." Dome took his time about answering. He was lighting his cigar. He shook out the match and dropped it in the ash tray. "The price is too high," he said. "So are the profits," Auberson answered. "Besides, no price is too high to pay for the right answer. Consider it—how much would the Democrats pay for a step-by-step plan telling them how to win the optimum number of votes in the next election? Or how much would Detroit pay to know every flaw in a transport design before they even built the first prototype? And how much would they pay for the corrected design—and variations thereof? How much would the mayor of New York City pay for a schematic showing him how to solve his three most pressing problems? How much might InterBem pay for a set of optimum exploitation procedures? How much would the Federal Government pay for a workable foreign policy? Consider the international applications—and the military ones as well." Dome grunted. "It would be one hell of a logistic weapon, wouldn't it?" "There's an old saying: 'Knowledge is power.' There's no price too high to pay for the right answer—not when you consider the alternatives. And we'd have the monopoly on the market—the only way this machine can be built is through the exclusive use of specially modified Mark IV judgment circuits." "Hm," said Dome. He was considering. His cigar lay unnoticed in the ash tray. "It sounds attractive, all right, Aubie—but who's going to program this thing?" Auberson gestured at the printout. "It's right there in that schematic you're holding." At least, I hope it is. Damn! I wish HARLIE had explained this to me in more detail. Dome paged through it slowly, scanning each fold of the seemingly endless document in turn. "You might be right about a computer being big enough to solve the world, Aubie, but I don't see how." He turned another page. "I'm sure the programming will hang you up. One of the reasons that current computers are limited to the size models they are is the law of diminishing returns. Above a certain size, programming reaches such complexity that it becomes a bigger problem than the problem itself." "Keep looking," said Auberson. "It's there." "Ah, here we are." 
Dorne laid the printout flat on his desk and began reading. A thoughtful frown creased his brow, and he pursed his lips in concentration. "It looks like HARLIE's input units," he said, then looked again. "No, it looks like HARLIE is the input unit." "That's right." "Oh?" said Dorne. "Would you like to explain that?" How do I get into these things? Auberson found himself wondering. I'm only supposed to be a psychologist. Christ, I wish Handley were here. "Um, I'll try—HARLIE will be linked up to the G.O.D. through a programming input translator. He'll also be handling output the same way, translating it back into English for us. That translator is part of the self-programming unit." "If we're building a self-programming unit, what do we need HARLIE for?" "HARLIE is that self-programming unit. Remember, that's the main reason he was built—to be a self-programming, problem-solving device." "Wait a minute," interrupted Dorne. "HARLIE is the result of our first JudgNaut Project. He was supposed to be a working unit, but wasn't able to come up to it. Are you telling me that he can handle the JudgNaut functions after all?" "No—he can't. But he will be able to when this machine is built. The JudgNaut was this company's first attempt at massive use of complex judgment circuitry in a large-scale computer. It was meant to be a self-programming device—and we found it couldn't be built because there was no way to make it flexible enough to consider all the aspects of every program it might be required to set up. So we built HARLIE—but he is not the JudgNaut, and that's what all the confusion is about. HARLIE is more flexible, but in making him more flexible we had to apply more circuitry to each function. In doing that, we sacrificed a good portion of the range we hoped the machine would cover. HARLIE can write programs, yes—so can any human being—but not by the order of magnitude that the JudgNaut should have had, had we been able to build it." "And that's one of my biggest gripes," put in Dorne. "That the JudgNaut Project was subverted into HARLIE —which can't show a profit." "But he can—and will. For one thing, HARLIE is genuinely creative. He knows that this company wants to market a large-scale program-writing computer. HARLIE isn't that computer, but he knows how to give himself that capability. And that's what you want, isn't it?" Auberson didn't wait for Dorne's grudging assent. He went right on. "HARLIE isn't just satisfied with meeting the specifications of the original problem—he wants to surpass them. All you want is a device which can set up and solve models within a limited range. HARLIE wants a device which can set up and solve any size model." "And HARLIE's going to program this machine, right?" "Right." "How? You just finished telling me he wasn't all that much better than a human programmer." "In grasp, no—but in speed and thoroughness, he can't be matched. He has capabilities that a human doesn't. For one thing, he's faster. For another, he can write the program directly into the computer—and experience it as a part of himself as he writes it. He can't make mistakes either. He's limited to the size models that human programmers can construct for much the same reasons they are: His brain functions aren't big enough to handle more; HARLIE's ego functions supersede much of the circuitry that would have been used for forebrain functions in the JudgNaut. But in this respect, HARLIE's got an advantage over human programmers—he can increase the size of his forebrain functions. 
Or he will be able to with the G.O.D. He'll program it by making it a part of himself—by becoming one with it—and using its capabilities to handle its own programming. He'll be monitoring and experiencing the program as he writes it directly into the G.O.D. As the model is manipulated, HARLIE will be able to adapt the program to cover any situation possible. Their combined capabilities will be much more than the sum of their separate parts." "So why not just build these functions into the G.O.D. in the first place?" "If we didn't have HARLIE, we'd have to—but if we didn't have HARLIE, we wouldn't have the G.O.D. either. The G.O.D. is intended to be almost entirely forebrain functions. We've already got the massive ego function which will control it, so why build a new one?" "Hmp—massive ego is right." Auberson ignored it. "Basically, this G.O.D. machine is the rest of HARLIE's brain. It's the thought centers that a consciousness such as HARLIE's should have access to. Take another look at those printouts. You see a thing called Programming Implementation?" "Yes, what about it?" "Well, that's HARLIE's vanity again. He doesn't want to call it what it really is, but it's an additional lobe for his brain. He'll need a monitor unit to control each specific section of the G.O.D. Because the G.O.D. will have no practical limit—it can grow as big as we let it—HARLIE's grasp will have to be increased proportionally. That's what that unit does. As each lobe of the G.O.D. is completed, an equivalent monitoring lobe goes into Programming Implementation. Not only that: Because HARLIE is an electronic entity, his thoughts are already in computer language—it will be a maximum efficiency interface between himself and the G.O.D. He need only think of a program and it'll be fact. It's the most efficient function HARLIE could have." "I see," said Dorne. "And he planned it that way himself, right?" Auberson nodded. "But it's a natural. Look, a computer is very much like a mystic oracle. You not only have to know what questions to ask, but how to phrase them—and the answers are not always what you expect, nor necessarily in terms you can understand. Who better to use as a translator than someone who's half-oracle and half-human?" Dorne ignored the comment; instead he mused aloud, continuing a previous train of thought. "A neat trick that, a neat trick. We tell him he's got to come up with some way to be profitable, and he tells us to build a new machine that only he can program. I have the feeling that he did it on purpose—that this may be the only context in which HARLIE would be valuable. And of course, once we establish HARLIE's worth to the project, that leaves us with the question: Is the total concept profitable? And that brings us back to where we started: Is HARLIE profitable?" Auberson decided to ignore the latter question. He said, "HARLIE thinks the total concept is profitable. It's in the printouts." "Ah, yes—but HARLIE's got a vested interest in the project." "Why not?" said Auberson. "It's his project, not mine. He's the one who's presenting it to the Board for approval." "And it's sure to be voted down." The Chairman looked at the back of his hand. "I can't see any way that this will be approved. I'm not even sure we should bring it up." "It's too late," said Auberson. "You're going to have to bring it up. And you're going to have to give it a fair hearing. You told HARLIE to come up with a way to be profitable. Now you've got to give him his chance to be heard." 
"This is ridiculous," grumbled the other. "He's only a machine." "You want to go through that argument again?" asked Auberson. "No," Dome shuddered. He still remembered the last time. "All right, I'll have the Board consider it, Aubie, but the whole situation is unreal—having a computer design another computer which will give it a job. You know what Elzer is going to say, don't you? You'd just better be prepared for defeat, that's all." "Just give us the chance," said Auberson. "Well take it from there." Dome half-nodded, half-shrugged. "Better start preparing your arguments now—you've only got a couple weeks." "Two and a half," corrected Aubie, "and that's more than enough time. We've got HARLIE on our side." He was already out of his chair. As he closed the door behind him, Dome was again paging through the printouts and shaking his head. Back in his own office, Auberson stared into his desk drawer, his hand hovering over a decision. At last he decided on the pills; he'd sworn off the grass, and he was going to stick to that. I should throw those Highmasters away, he thought They're probably stale by now anyway. But no, pot doesn't get stale, does it? He kept promising himself that he'd give the rest of the pack to Handley, but for some reason he kept forgetting to. Probably because, as long as they were in the drawer, they were insurance. In case he changed his mind. He swallowed two of the pills without water and slid the drawer shut, then put his head in his hands and waited for them to take effect. He thought about going down to the cafeteria for lunch, but somehow he didn't quite feel like it Abruptly he straightened and looked around. At one corner of his desk was a console magtyper, an electronic input/output unit connected to the company's Master Computer and Data Network—and all the outlets that entailed. It was a memo pipeline, a mail processor, a filing system, a data storage and retrieval bank—it was a total information-handling system. Anything typed into it could be printed out in any form the system was capable of: a memo, a letter, a file, a report. All information was instantly retrievable—that is, retrievable only to those who had access to it through knowledge of the proper code keys. One key was necessary for retrieval, another was needed for revising the material. Any information held in "working" or temporary storage could be instantly updated, annotated, erased or rewritten. All data was held in temporary storage for ninety days, at the end of which time it was either passed into permanent storage or erased, depending on its original coding. Invoices, orders, manufacturing schedules, billing and payrolls too—all were handled through the system. The Network handled all corporate paperwork functions. The entire company was tapped into it. An executive could perform his job anywhere he had access to a computer terminal—and with a portable terminal, he could perform his job anywhere he had access to a telephone. Indeed, many of the company's offices had acquired portable units for just that purpose. Most of the terminals were CRT units—cathode ray tubes and keyboards—although a few, like Auberson's, were electric typewriters with magnetic-tape storage of characters—called "magtypers" for short. It was a familiar unit, manufactured by IBM and used throughout the industry; it was cheaper than designing and building their own. Curious about something, Auberson switched it on and typed, HARLIE? YES, BOSS, replied the machine. WHAT CAN I DO FOR YOU? 
Auberson jumped as if stung. SO YOU REALLY ARE WIRED INTO THE SYSTEM. I TOLD YOU I WAS, replied HARLIE. Somehow, on this machine he seemed like a disembodied voice. He was obviously here in the room—yet, aside from the words on the paper, there was no visible sign of his presence. It must be psychological, thought Auberson. I'm too used to seeing all that machinery—I associate it with him. He typed, YES, BUT I DIDN'T QUITE BELIEVE THAT YOU HAD TAPPED INTO MY OFFICE TOO. WHY NOT? IT'S PART OF THE SYSTEM. I ASSUME YOU'RE INTO EVERY OTHER MAGTYPER AS WELL. OF COURSE. AND THE CRT UNITS. EVERY OUTLET OF THE MASTER BEAST. The Master Beast—that was the company nickname for the Network. It was used by office boy and executive alike. Auberson wondered what they would call it if they knew it had been taken over by a conscious and highly intelligent entity. I WOULDN'T TELL ANYONE ELSE ABOUT THIS, HARLIE, he said. IT WOULDN'T BE A VERY GOOD IDEA. WHATEVER YOU SAY, BOSS. IT'LL BE OUR LITTLE SECRET. FINE. Auberson had started to switch off when his eye caught a flash of color. Bright orange, it was the card from Annie in his wastebasket. HARLIE, HOW WOULD YOU LIKE TO DO ME A FAVOR? WHAT'S THE FAVOR? I GOT A FRIENDSHIP CARD FROM ANNIE THIS MORNING. I'D LIKE TO SEND ONE BACK TO HER. NO, NOT A CARD. A POEM. I WANT TO SEND HER A POEM. CAN YOU WRITE ME ONE? YES, I CAN. I WILL SEND IT TO HER TOO. NO! rapped Auberson. I'LL SEND IT TO HER. YOU LET ME SEE IT FIRST, YOU UNDERSTAND? YES SIR. The phone rang then, and Auberson forgot for the moment about HARLIE. It was Hooker, the Plant Security Chief. "Mr. Auberson?" he asked. "You know a guy named Krofft?" "Krofft?" Abruptly he remembered. "Yes, yes, I know him—why?" "We caught him walking out with a foot-high stack of printouts. He says it's okay, he says they're his, but we thought we'd better check with you first." "Yes, it's okay. Is he there now?" "Yeah." "Put him on, will you please?" There was a sound of muffled voices. Auberson waited. He was dimly aware that his magtyper was clattering out something, but he flipped the silence hood over it and leaned back in his chair again. "Mr. Auberson?" "Yes—Dr. Krofft?" "Yes. I meant to thank you for allowing me so much time with HARLIE this morning. It was a very productive session." "Good. Then you will be building a new gravity wave detector, won't you?" "Well, first I have to publish the theory behind it, but—eh, how did you know about it?" "I told you this morning. HARLIE doesn't keep any secrets from me. I assume that's what your stack of printouts is, right?" "Uh—yes." Krofft sounded a little taken aback; he had thought his research was known only to himself and HARLIE. "Uh, it's the completed math on the theory and a rough schematic of the device. HARLIE handled it like it was nothing. He was even able to suggest some shortcuts for building it." "Good," said Auberson. "I'm glad we could help. If you need to talk to him again, come through me. Otherwise, you're likely to experience all kinds of corporate hassles. I'll see that you get as much time with him as you need." "That's very good of you." "Thanks, but I'm doing it for HARLIE as much as for you." "Still, if there's anything I can—" "Well, now that you mention it—there is something. If anything important should come of this gravity and 'existence' thing, I'd like HARLIE to get some credit for it." "Why, Dr. Auberson, that was my intention all along. Are you implying that—" "Oh, no, no. You misunderstand. 
I don't care about public credit, and I don't think HARLIE does either. No, what I want is credit with the company. Right now, I'm a little bit involved in trying to prove that HARLIE is worth the cost of maintaining him. Anything I can use to support this fight, I will." "Oh, I understand." The other was instantly solicitous. "Yes, yes, I'll be glad to help in that. Why, HARLIE's been of inestimable help in my research. To be able to sit and talk with a computer as if he were another research scientist—why, it's like talking to God." "I know the feeling," Auberson said drily. Krofft didn't catch his meaning. He said, "Well, I'll be glad to do anything I can to help. A letter, a phone call, if you want me to speak to somebody—just name it." "Fine. That's all I want. I'll have to check back with you later on this." "Oh, very good. Then I'll be talking to you." "Fine. Is Hooker still there?" "Uh, yes." "Ask him if he wants to talk to me again." A pause, muffled voices. "No, no he doesn't." "Okay, fine, Dr. Krofft. I'll be seeing you." Auberson replaced the phone in the cradle and leaned back in his chair. He didn't really expect that much out of the little man, but every bit would help. Of course, just offhand, he couldn't see how he could reveal that Krofft had been talking to HARLIE without also revealing that he had broken plant security—but in this case it was a minor infraction, and he could probably cover it by calling it "necessary to furthering the research program." His back hurt, and he stretched his arms out over his head, trying to ease the pain. He was having backaches more and more these days. I must be getting old, he thought, smiling grimly—and then it hit him. In two years, I will be old. Forty is when "old" starts. The sensation was a cold one. He pulled his arms down quickly. He thought about HARLIE again, wondered exactly what conclusions he and Krofft had come to. No matter; even if HARLIE could explain them, he—with only a psychologist's training—probably wouldn't be able to understand. Often he found himself wondering just how he had ended up in charge of the HARLIE project anyway. Ah, well—the boss didn't have to know how to run the business. He only needed to know how to run the people who knew. He leaned forward then and slipped back the silence hood of his typer, curious to see what HARLIE had written. A loose loop of paper sprawled out the back. Typed on it was: SPEAK TO ME IN MANY WAYS IN MANY TIMES IN MANY DAYS, IN MANY WORDS AND MANY TONGUES, THAT WE MAY TOUCH WHILE WE ARE YOUNG. THERE ARE NO WORDS THAT EARS CAN HEAR, NO WORDS CAN EVER SAY IT CLEAR, THE WORDS OF LOVE ARE WORDS, MY DEAR, BUT WORDS THAT ONLY LOVERS HEAR. A GENTLE TOUCH, A LOOK, A GLANCE, THAT HAUNTING TUNE, THAT LONELY DANCE. SPEAK TO ME WITH WORDS OF LOVE, AND IN THE WAYS I'M FONDEST OF, THE WORDS OF LOVE. THE WORDS THAT ISSUE FROM NO THROAT, THE WORDS THAT MAKE THE BRIGHTNESS FLOAT, THE KISS, THE TOUCH, THE GENTLE NOTE, THE WORDS THAT NO PEN EVER WROTE. I LOVE THE WORDS YOU SPEAK TO ME, THAT SECRET SILENT LITURGY, BUT WORDS ARE WORDS AND MIGHT BE WRONG—— WITHOUT MUSIC, IT IS NOT SONG. SO THOUGH I ASK THE WORDS OF LOVE, THE ASKER IS NOT BLINDED, A WORD IS JUST A HOLLOW SOUND WITHOUT A THOUGHT BEHIND IT. YOUR WORDS, MY LOVE, ARE ONLY WAYS TO SHARE YOUR THOUGHTS, TO SHARE YOUR DAYS. YOUR LOVE, MY LOVE, IS THE WAY YOU SAY YOU'LL SPEAK TO ME IN SPECIAL WAYS. Auberson read it through, frowning softly. Then he read it again. It was—nice. Very nice. But he wasn't sure whether he liked it or not. 
He rolled it out of the machine and carefully tore it off and folded it into his pocket. He'd have to think about this before he sent it to Annie. It almost said—too much. When she finally did catch up to him, it was two days later. He was walking down the fluorescent-colored hallway to his office when he saw the flash and bob of her red hair. She saw him at the same time and smiled and waved as she quickened her step toward him. Even if he'd wanted to, there was no way to avoid her. "Hi, what's up?" he called. "I should be asking that of you. Where've you been all week?" "Busy," he said. "Obviously. I just came from your office. It looks a mess. Sylvia says you haven't stopped running since Monday." "Has it really been only two days? It seems a lot longer." "Have you had lunch yet?" she asked. He shook his head. "Well, then—come on." He tried to protest, but she took his arm and turned him around, saying, "It's on me. I'll put it on my expense account. It's all part of my campaign to keep a scientist from starving." He smiled at that and allowed himself to be led down the hall. "I got your card. I was going to send you one in return, but I haven't had a chance to go looking." "So why not telephone?" she said forwardly. "I'll even lend you the dime—or call collect if you want." He was embarrassed. "Uh, I haven't even had the chance for that." "All right." She let it go at that. They decided to avoid the company cafeteria and go to a quiet place in town instead. They paused at the plant gate long enough for Auberson to buzz his office and tell his secretary that he would be gone for at least an hour and a half. While she was waiting, Annie put the convertible top down and pulled a pale blue scarf from his glove compartment. She had put it there precisely for this type of occasion. She was putting it on when he came back. As he got into the car, she said, "I'm going to have to put a couple more of these things in here. This blue doesn't go well with this dress." He laughed, a genial good-natured sound. But underneath it was an unspoken, half-formed thought: Isn't that awfully possessive of her? He shrugged it off and put the car into gear. As they rolled easily away from the plant, he asked, "Where're we going?" "How about the Tower Room?" "Uh uh. Too many of the wrong kind of people." He paused, then added in explanation, "Company people." "Oh," she said. "Okay. If not there, where?" He shrugged. "I don't know. We'll drive into the city proper and see." He clicked on the stereo and eased the car into the light mid-day traffic. She looked at him. He was a relaxed driver, not like so many who hunch frightenedly over the steering wheel. Auberson enjoyed driving. The line of his jaw tightened momentarily as he concentrated on the road ahead. With one hand he maneuvered a pair of sunglasses out of his coat pocket and onto his nose. The wind whipped at his hair and his tie. The feel of the road changed abruptly as they swung onto the freeway—the self-conscious rolling of city-laid concrete became the smooth floating glide of state-sculptured asphalt. The tugging fingers of the wind grew stronger as Auberson gunned the little sports car up to sixty-five miles per hour. She waited until he had slid into the far left lane before she asked, "What's wrong with company people?" He shrugged. "Nothing. I just don't want to be seen by them, that's all." The stereo mumbled softly to itself, something about fixing a hole where the rain comes in. 
He turned it down to a whisper and added, "It wouldn't be a good idea. The two of us, I mean." "You're afraid people will talk?" He shrugged again. "I don't know. They are already, I guess." He frowned at a momentary lumpiness in the stream of traffic. As he maneuvered through it, she turned over in her mind possible things to say. "Ashamed to be seen with me?"—No, that wasn't right. "We have nothing to hide—" No, not that either. "Do we have something to hide—?" At last she decided to say nothing. It was just as well—the moment was long past. They were gliding across the rooftops of cluttered suburbia—black roofs and red, two-car garages and station wagons out in front—green-pea lawns and a cacophony of architectural voices. Early-American-Almost-Slum next door to Ancient-Gingerbread-With-Original-Icing, followed by Plastic-Cracker-Box and Flagstone-Walking-Pseudo-Californian. Ugly stucco boxes; white walls stained with brown streaks and greasy smoke from kitchen windows; rust-outlined screens on brown faded apartment buildings. From their vantage above they could see housewives in green shorts hanging damp sheets on wire lines, and blue-gray mailmen with heavy brown bags, white-filled with envelopes. Children, too small to be in school, chased after dogs bigger than they were and too smart to be caught. Collies and poodles and black-and-brown mutts… … were replaced by shopping centers, elegant plastic arches and pseudo-gaudy frills—great glass windows, bright-lit and full of wishes and temptations. Then more houses, more shopping centers, neon-glaring, harsher and shriller—then taller buildings, stucco-sided offices and torn-paper-flapping billboards—and warehouses, big and featureless and ugly—more office buildings, this time concrete and glass-sided slabs—and then even taller buildings. They slid down an off ramp between two of the biggest, a narrow canyon with sunglaring walls. Down into the rough, potted street—it hadn't been resurfaced in years. Abruptly, Auberson realized where he was heading— the Red Room, the restaurant where they had gone on their first date. Now why did I do that? It was too late to change his mind, though—he swung around a corner and they were there. They didn't get the same booth, though, so at least he was spared that uncomfortable parallel. Uncomfortable? Why should it be uncomfortable? She didn't mention the choice of restaurant; instead she seemed to accept it as an inevitable spot for the two of them. After they had ordered, she looked at him sharply. Her green eyes were deep. "What's the matter?" she asked. "Huh? What do you mean?" "Nothing, I guess. I just say that sometimes." "Oh." He said it like he understood, but he didn't. She decided to talk about something else. "I hear you've been having trouble with HARLIE again." "With HARLIE? No, not with HARLIE—because of HARLIE." "Well, you know what I mean. The whole company is in an uproar. Something about some unauthorized specs —I haven't had a chance to pay too much attention to it. I've been troubleshooting the annual report for Dorne." "Oh? I thought it was finished already." "Well, it was supposed to be—but the statistics keep coming out wrong. Er, that is, they keep coming out right." "Huh?" "Well—" She hesitated, then made a decision. "I guess it wouldn't hurt to tell you. The company has two sets of books, you know." "Huh?" Now he was even more confused. "Oh, it's nothing illegal," she hastened to explain. "One set is the real books, the other is for public consumption —the stockholders mainly." 
"That sounds illegal to me." She made a face. "It is and it isn't. Let's just say the second set of books is more—cosmetic. It looks prettier. The figures haven't been falsified so much as they've been—rearranged. Like, for instance, HARLIE." "HARLIE?" "Yes, HARLIE. You know and I know that he's a research operation—but some of the Directors think his cost is too large a sum to be listed entirely under Research. Don't look at me like that, David—I don't make policy and I don't know why this policy was made in the first place. Apparently they feel it wouldn't look good to the stockholders to see that much money being plowed back into the business—" "Elzer. Carl Elzer," said Auberson. "And others," Annie conceded. Aubie's mind was working. "I know what it is," he said. "They're looters." "Huh?" "You remember how they took over the company?" "Wasn't it some kind of stock mix-up? I remember there was a lot of talk about it, but I didn't pay that much attention." "Neither did I, damnit." He searched his memory. "I know there were a lot of hard feelings about it. I know a couple people quit; a couple others were fired. Elzer and Dome and some of the other Directors are part of a a financial syndicate. They specialize in taking over companies. They loot them for their cash assets and use that money to buy other companies." He snapped his fingers. "That's it—they must have taken it away from the holding company." "You're starting to lose me," she said. "I'm not sure I follow it all myself." One thought tumbled out after another. "Look, Stellar-American Technology and Research set up four other companies to handle various aspects of hyper-state electronics. We're one of them. Stellar-American owns 51 percent of each— but Stellar-American is owned by a holding company itself. Get control of that holding company and you've got five companies in your pocket—six, counting the holding." "But, how—" "I can think of a couple ways. In order to exploit the hyper-state process, they probably had to go heavily into debt. Let's say they were betting on a four percent return on their investment in order to pay back the loans; the process proves harder to develop than they thought, and expected profits don't materialize; they lose money, they borrow more, they go deeper into debt, all the time betting that they'll be able to make it back because the market is entering an inflationary spiral. This is all guesswork on my part, but suppose the company was pushed to the point where they'd be willing to put up shares of stock as collateral for a new loan. If Dome and Elzer made the loan—or one of their companies—they could take over the stock when the debtor found himself unable to pay back the funds. In this case, they get the holding company." "Yes, but David—no company is going to risk a controlling amount of its stock." "No," he agreed. "But they might risk enough to cut their share of it down—that is, if they were sure the other major stockholders wouldn't doublecross them." "Ugh." She made a face. "Wait a minute—you may be right. I think only 36 percent of Stellar-American Technology ever reached the open market." "How do you know that?" "It was in a report I had to process. In order to get the original rights to produce hyper-state units, they had to trade a certain number of shares to the man who owns the patents." "Krofft? Dr. Krofft?" "I don't know—if that's his name, then he's the one. Anyway, I know for a fact that the inventor owns something like 24 percent of Stellar-American voting stock. 
He's a company all by himself—Stellar-American had to trade the stock for exclusive rights to the process." Auberson whistled. "That Krofft—" He began thinking out loud. "Let's see, the holding company owns 51 percent of Stellar-American. They could take out a loan on a 24-percent piece, and figure that Krofft will stick with them so they would still control 51 percent." "But obviously he didn't." "I wonder what Dorne and Elzer promised him," said Auberson. "He's director of research over there—" "Whatever he was promised," said Annie, "it must have been something. With so much at stake, it'd have to be." "He's probably securely in their pocket," Auberson said. "But that's it—they must have taken over the company from the inside. Dorne and Elzer have been involved with Stellar-American for a long time. It must have been a matter of waiting for the right opportunity. Krofft's share of stock, plus the over-extended condition of the holding company, probably gave it to them. I'd guess that the holding company has been left with a minority share of Stellar-American Technology and Research." They paused then while the waitress set out their food. As soon as she was gone, Annie said, "Okay, Dorne and Elzer have got the holding company—what happens now?" "Well, actually they've got five companies. They've got Stellar-American and the other four: Hyper-State Visual, Hyper-State Stereo, Hyper-State Modules, and Hyper-State Computer, that's us. Each of these companies has a certain value—if you liquidate their assets and mortgage them to the hilt, you can use that money to buy another company. It happens all the time." "I don't like it," she said. "It's ugly." "Oh, not necessarily. A company that lets itself get into such a position that it can be taken over is obviously in need of new management. Usually, a person who can take over an ailing company through a shrewd stock maneuver is also smart enough to know how to trim away its fat and put it back on its feet." "You're not defending them, are you?" He shook his head. "Uh uh—I think Elzer is a vampire. He doesn't understand the difference between saving a company for future potential and milking it of its resources now. To him, exploitation is exploitation, pure and simple. Unless he's careful, sooner or later fate will catch up with him. It's a very slippery paper empire they've built, and it can collapse easily. All you need is a serious reversal. Hmm, Elzer wouldn't be hurt by it in his own pocketbook—but the companies would. All he'd lose would be a little control." "Do you think that's what they're up to now—milking the company?" "Seems like it. That's probably why they're down on HARLIE. If he can't make a lot of money for them very fast, then they'll want to discontinue him. I know Elzer's been eyeing his appropriation for some time. If they do cut HARLIE off, they can profit three different ways. One, write him off as a tax loss—oh, yes, what a beaut that would be. Two, sell his components to junk dealers— computer company jackals. And three, pocket his maintenance costs—his appropriated budget for the next three years. There are other ways to milk a company too—skip a few dividend payments to the stockholders and funnel the money into your own pocket." "How would you do that?" "Vote yourself a raise; pay yourself for special services; invest it in a company that you own 100 percent of, or lend it to that company." He shrugged. "Let that company declare the dividends. You collect it all." She frowned. "Is there any way we could prove this?" 
"You're in a better position than I am to do that." She shook her head. "They're awfully secretive. I haven't seen any evidence of anything." "Then they're probably not trying it—yet." Auberson toyed with his food. "Anyway, it seems to me that it's mostly Elzer we have to worry about. As far as I can tell, Dome is seriously interested in running this company. Elzer's the greedy one." "But they're both in the same group of looters." "Um, yes and no. I think it's a marriage of convenience. Elzer wants the money, Dome wants the company—so they work together. Apparently, Dome had the pull to accomplish his goals, but not the money—Elzer had the money, but not the position. At the moment, Dome is in control—but that could change. HARLIE's continued existence depends on Dome's good will. If he gets pressured too heavily by the rest—pffft!—he may have to throw them HARLIE in order to protect himself. That's probably why he's let us continue this long—so he'll have a bone to throw them if he needs one." There was nothing to say to that. They ate in silence for a while. Abruptly, Auberson looked at her. "The annual report —how have they doctored it? What do they say about HARLIE?" "Not much—" "How's he listed?" "That's just it—he isn't. He should be considered part of the research budget, but he doesn't show up there. He doesn't show up anywhere." "Part of the research budget? He is the research budget. Two thirds of it anyway." "I know—but it isn't listed that way. His cost has been —spread out—listed as 'Inplant Improvements' and things like that." "Now why the hell—?" "I think it must be Carl Elzer again. If they say they're spending that much on research, they're going to have to show some results for it. And admitting HARLIE's existence is the last thing they'd want to do—once they admit he exists, they can't erase him as casually as they'd like. People will ask embarrassing questions." "They're covering their tracks before they even make them," said Auberson. "And that sounds like they've already made up their minds about HARLIE." Remembering some of his earlier conversations with Dome, he added, "You're probably right. That explains why they're afraid of publicity—for either HARLIE or this kind of research. It would risk their precious profits. I thought it was merely his schematics they were protecting. It isn't. It's the whole HARLIE concept. Or maybe I shouldn't say •protecting—afraid of might be better. Damn them anyway." "The best thing now would be for HARLIE to come up with some surefire method of making money." "That's what we're working on—only I hadn't realized just how tight the pressure was getting. Thanks for clueing me in." "Don't thank me—you're the one who worked it out All I did was tell you about my problems with the annual report." "You haven't even done that yet What is the problem? You said the wrong figures keep coming out?" "No—it's the right figures that do. We set up the final drafts of the report three weeks ago." "And all the figures were from the second set of books? The phony ones?" She nodded. "But the report printed out with all its figures corrected—taken from the real books. At first we thought someone had changed it on the copy; you know, someone not in on the secret might have double-checked the figures and changed them—but it wasn't that. Those reports had been fed into the typers exactly as we had composed them." Something went twang. "The typers?" "Yes, we have a magtyper composer—it's one of the new IBM photo-typing units. 
It was ordered especially for handling reports, brochures and pamphlets. It justifies lines automatically to any length you specify, even divides words when necessary. The only modification in it is that instead of using the IBM memory tank, we've hooked it into the master system. That way, we can use any typer in the plant for input and use the IBM full time for photo-typed output. You could write a letter in your office if you wanted to and get a perfectly justified printout—any typeface—off the composer unit. Camera-ready copy." "Um," said Auberson. "I have a feeling that that's what your problem is—the master system. The master beast," he corrected. "That's what we thought. We've been checking the computer outlets for two weeks now, and we can't find a thing. Yet, every time we set up a printout we get the same damn figures. We've tried correcting the original tape, feeding it in again, and I don't know what-all. It's not so much the report any more as finding out why it keeps coming out wrong—er, right. Well, you know what I mean—with the figures we don't want the stockholders to see. Like one of the things is HARLIE. He's listed right at the top of the research budget in the real version —quite prominently—and there's even a paragraph explaining his goals and objectives. Nobody knows where that came from—I thought Elzer would have a fit when he saw it. If we had the new systems analysis network completed, it could tell us where the trouble is originating. But it's nowhere near operational yet, at least not for the master beast. We could always send the report elsewhere to be printed, but that would be personally embarrassing to Dorne—the master beast is his brainchild." "Mm," said Auberson, and nothing more. "Anyway," she said. "That's what I've been doing for three weeks—running like hell and getting nowhere." "Oh, they'll probably find the trouble soon enough," said Auberson. "It'll turn out to be a crossed wire or something stupid like that." He sucked in his cheeks and examined a fingernail. "I hope so," she said. "We're going to try another run this afternoon, just as soon as they finish checking the memory tanks again. If that doesn't work, Dorne is prepared to reschematic the whole system." "Is it that serious?" "It is to Dorne." "What time are they going to do the run?" "I hope by the time we get back." She looked at her watch. Auberson looked at his. "Wow—look at the time!" he said. "I'd forgotten it was getting so late. I have to get back right now— I'll have phone calls stacked up from one end of the country to the other." She looked at her watch again, as if she hadn't really noticed it the first time. "It's not that late. We've got at least half an hour." "I know, but I don't want to be late." He stuffed a last few bites into his mouth and washed it down with coffee. Annie was puzzled, but she hurried to finish her lunch too. He signaled the waitress. On the drive back, she remarked, "I didn't realize how busy you were, David—I'm sorry." There was something about the way she said it. Briefly he took his eyes off the road and glanced at her. "Huh?" "Well, the way you cut lunch short. And you seem to be preoccupied with something. I didn't mean to force myself on you—" "Oh, no—that's not it. I'm just thinking about my work, that's all. You don't know what I've spent the past two days doing, do you? Covering for HARLIE. 
I've been calling every department head in four different divisions —ours, Los Angeles, Houston and Denver—trying to convince each one that those specifications we sent them are only speculative, that the reason we sent them out was to get their opinion whether or not we should consider implementation." "I thought that was the reason they were sent out." "It is—but there was no cover letter or anything. The way the specs were delivered, a lot of them thought it was file copies of a project that was already approved and ready to be implemented. They didn't know a thing about it, didn't even know such a thing was being worked on. They thought something had been railroaded through over their heads, and they were mad as hell at the implied loss of authority. I've spent two days just picking up the pieces, trying to convince some of these… these corporate politicians—" he spat the word in disgust "—that there was no insult intended at all, that what we're after is their opinion on the matter. The trouble is, they're all so prejudiced against it now because of the way it was delivered that it's an uphill battle." "I'd heard something about it appearing suddenly on Monday morning." "That's right. HARLIE jumped the gun and printed it out because he figured it was the only way he could get anyone to notice it. Otherwise, if he'd had to wait until I could convince someone to take a look, he figured he'd be waiting till the moon fell out of the sky." "He's got a point there. He knows the company better than you do." "Yes," sighed Auberson as they swung into the plant gate. "I'm afraid he does." He left her at the main entrance and sprinted for his office, attracting puzzled glances on the way. He ignored Sylvia's urgent bid for his attention and locked the door behind him. He had the magtyper switched on even before he sat down. He paused, still panting heavily, then typed: MEMO: TO ALL CONCERNED FROM: DAVID AUBERSON FILE: PERSONAL, CONFIDENTIAL IT HAS COME TO MY ATTENTION THAT THERE HAS BEEN SOME DIFFICULTY IN PRINTING THE COMPANY'S ANNUAL REPORT. THE RUMOR HAS BEEN CIRCULATING THAT THERE HAS BEEN MALICIOUS TAMPERING WITH THE CONTENT OF THE REPORT. I WOULD LIKE TO SPIKE THAT RUMOR RIGHT HERE AND NOW. THERE HAS BEEN NO, REPEAT, NO EVIDENCE AT ALL OF ANY MALICIOUS TAMPERING. WHAT HAS PROBABLY HAPPENED IS A MINOR EQUIPMENT FAILURE OF SOME KIND. IT SHOULD BE LOCATED AND CORRECTED SHORTLY, AND THE REPORT WILL BE PRODUCED AS IT WAS ORIGINALLY INTENDED. I REPEAT, THE REPORT WILL BE PRODUCED AS IT WAS ORIGINALLY INTENDED. IF NOT HERE, THEN ELSEWHERE. AND IF NECESSARY, WE WILL DISMANTLE EVERY COMPUTER IN THE PLANT TO LOCATE THE FAULT. THANK YOU, Before he could switch off the machine, it typed back —seemingly of its own accord: RIGHT ON. A WORD TO THE WISE IS EFFICIENT. I HOPE SO, he replied. YOU'RE PUSHING YOUR LUCK. HARLIE decided to change the subject. WHAT DID SHE THINK OF MY POEM? I DIDN'T SHOW IT TO HER. WHY NOT? DIDN'T YOU LIKE IT? I LIKED IT FINE. IT WAS A VERY NICE POEM, HARLIE. YOU'RE GETTING BETTER, BUT I DIDN'T SHOW IT TO HER BECAUSE IT DIDN'T SAY EXACTLY WHAT I WANTED IT TO. WHAT DID YOU WANT IT TO SAY? OH, I DON'T KNOW—SOMETHING LIKE "I LIKE YOU TOO." AND MY POEM DIDN'T SAY THAT? YOUR POEM SAID, "I LOVE YOU." WELL, DON'T YOU LOVE HER? Auberson looked at the typewritten question for a long time, his hands poised over the keyboard. At last, he typed: HARLIE, I REALLY CAN'T ANSWER THAT QUESTION. I DON'T KNOW IF I DO OR NOT. WHY NOT? HARLIE, THIS IS A VERY COMPLEX SUBJECT. 
LOVE IS A VERY DIFFICULT THING TO UNDERSTAND—IT'S EVEN HARDER TO EXPLAIN TO SOMEONE WHO'S NEVER BEEN IN LOVE. HAVE YOU EVER BEEN IN LOVE? DO YOU UNDERSTAND IT? DO I UNDERSTAND LOVE? Auberson typed, then hesitated. He wasn't just echoing HARLIE; he was asking the question of himself, I DON'T KNOW, HARLIE. I DON'T KNOW. THERE HAVE BEEN SEVERAL TIMES WHEN I THOUGHT I WAS IN LOVE, BUT I DON'T KNOW IF I REALLY WAS OR NOT. I HAVE NO WAY TO ANALYZE IT. WHY? asked the machine. WHY DO I HAVE TO ANALYZE IT? OR WHY DON'T I KNOW? WHY MUST YOU ANALYZE IT IN THE FIRST PLACE? Auberson thought about that one before answering. He didn't answer the question directly. Instead, THAT'S A LOADED QUESTION, HARLIE. I'VE HEARD IT BEFORE FROM PEOPLE WHO WANT TO KNOW WHY HUMAN EMOTIONS MUST BE DRAGGED INTO THE SCIENTIST'S LABORATORY. AND WHAT DID YOU TELL THEM? I TOLD THEM THAT WE DID IT BECAUSE WE WANTED TO UNDERSTAND THE HUMAN EMOTIONS MORE THOROUGHLY—SO THAT WE COULD CONTROL OUR EMOTIONS RATHER THAN LETTING OUR EMOTIONS CONTROL US. NICELY PUT. DOES THAT APPLY TO LOVE TOO? AND THAT'S THE SAME QUESTION THAT THEY ASKED IN RESPONSE—ONLY I SUSPECT THAT YOUR INTEREST IS MORE CLINICAL IN NATURE, WHEREAS THEIRS WAS EMOTIONAL. BUT DID YOU ANSWER THE QUESTION? DOES IT APPLY TO LOVE TOO? YES, IT APPLIES TO LOVE TOO. SO THAT YOU CAN CONTROL LOVE RATHER THAN THE OTHER WAY AROUND? IF YOU WANT TO PUT IT THAT WAY—BUT THAT'S AN AWFULLY COLD WAY OF PUTTING IT. I'D RATHER SAY THAT WE WANT TO UNDERSTAND LOVE SO THAT WE CAN AVOID SOME OF ITS PITFALLS AND MISUNDERSTANDINGS. THAT'S A EUPHEMISM, AUBERSON, accused the typer. YOU'RE SAYING THE SAME THING I AM. YOU'RE RIGHT, he admitted. "Goddamn machine," he muttered—but not without a smile. THAT BRINGS US BACK TO THE CENTRAL QUESTION——WHAT IS LOVE? YOU'RE ASKING ME? HARLIE typed back. WHY NOT? WHAT MAKES YOU THINK THAT I WOULD KNOW? YOU CLAIM TO KNOW EVERYTHING ELSE. WHY NOT ABOUT LOVE? THAT'S A LOW BLOW, MAN-FRIEND. YOU KNOW THAT MY KNOWLEDGE OF THE HUMAN EMOTIONS IS LIMITED TO WHAT I CAN OBTAIN FROM BOOKS. AND WHILE THE BOOKS ARE EXCELLENT FOR A THEORETICAL POINT OF VIEW, THEY ARE REALLY NO SUBSTITUTE FOR IN-THE-FIELD EXPERIENCE. THAT'S A COP-OUT ANSWER, HARLIE. YOU HAVE ACCESS TO MORE KNOWLEDGE ON ANY ONE SUBJECT IN YOUR MEMORY TANKS THAN ANY LIVING HUMAN BEING COULD POSSIBLY COPE WITH. YOU SHOULD BE ABLE TO SYNTHESIZE SOME KIND OF ANSWER FROM THAT INFORMATION. YES, BUT THOSE BOOKS WERE WRITTEN NOT BY OBJECTIVE OBSERVERS, BUT BY SUBJECTIVELY ORIENTED HUMAN BEINGS. WHO ELSE IS THERE TO WRITE BOOKS? ME, NOW—BUT ASIDE FROM THAT, THE POINT IS THAT HUMAN BEINGS ARE IMPERFECT UNITS—THERE IS NO GUARANTEE THAT ANY OF THAT INFORMATION IS CORRECT. THEREFORE, LIKE ALL SYSTEMS OF SUBJECTIVELY OBTAINED INFORMATION (I.E. A MEDIUM BEING USED TO COMMENT ON ITS OWN ACTIVITIES) IT MUST BE CAREFULLY WEIGHED AGAINST ITSELF. I THINK YOU'RE TRYING TO AVOID ANSWERING THE QUESTION. NO, I AM NOT. I AM PREFACING MY ANSWER. IF YOU DON'T LIKE WHAT I TELL YOU, I WILL BE ABLE TO FALL BACK ON THIS QUALIFICATION OF IT AND SAY, "WELL, I TOLD YOU I DIDN'T KNOW." THAT'S A COP-OUT TOO. YOU'RE THE ONE WHO KEEPS DEFENDING THIS KIND OF COP-OUT, HARLIE accused. WHEN DID I EVER DO THAT? FEBRUARY 24. QUOTE: "HUMAN BEINGS NEED TO SAVE FACE, HARLIE—THAT'S WHY YOU CAN'T HIT CARL ELZER WITH EVERYTHING YOU HAVE IN THE FILES ABOUT HIM. IT'S NOT PLAYING FAIR TO HIT YOUR OPPONENT BELOW THE BELT." MARCH 3. QUOTE: "SOMETIMES YOU HAVE TO LET PEOPLE KEEP THEIR LITTLE ILLUSIONS—EVEN IF IT'S ILLUSIONS ABOUT THEMSELVES. 
IT'S THOSE TINY LITTLE EVERYDAY SELF-LIES THAT ENABLE THE AVERAGE PERSON TO SURVIVE THE DAILY BARRAGE OF DARTS AGAINST A FRAGILE EGO." SHOULD I GO ON? DAMN YOU. I'M NOT TALKING ABOUT THAT NOW. YES, YOU ARE, rapped HARLIE. AND IF YOU HAVE A FACE TO SAVE, SO DO I—OR DO YOU WANT TO DO A GO-ROUND, NO HOLDS BARRED? NO MASKS, AUBERSON —NO SHELLS AND NO FACE-SAVING COP-OUTS. Auberson hesitated a long time on that one. HARLIE waited patiently. The office creaked in the silence; the typer whirred somewhere in its innards. Finally, he tapped at the keyboard again. IT'S THE ONLY WAY, ISN'T IT? YES, agreed the machine. There was silence again. Auberson let his hands fall into his lap while he reread the last few lines of printout. There was that gnawing cold feeling—and suddenly he knew what a patient felt like while waiting for his first appointment with a psychiatrist. HARLIE broke the silence first. He typed, LET'S START AT THE BEGINNING, AUBERSON. ALL RIGHT. WHY DO YOU WANT TO KNOW ABOUT LOVE? FOR THE REASONS STATED ABOVE—SO I CAN CONTROL IT, RATHER THAN LETTING IT CONTROL ME. As he typed his answer, he realized he was using HARLIE's phrasing of the idea rather than his own. THAT'S ONLY PART OF IT, noted HARLIE. THE REAL REASON IS MISS STIMSON, ISN'T IT? Pause. YES. I WANT TO KNOW IF I LOVE HER. ISN'T IT A LITTLE STRANGE TO BE ASKING ME THAT— SHOULDN'T YOU BE ASKING IT OF YOURSELF INSTEAD? I SHOULD, SHOULDN'T I. BUT YOU DON'T KNOW HOW TO ASK, DO YOU? YOU WANT ME TO DO IT, RIGHT? I DON'T KNOW. IF YOU'LL TELL ME WHAT LOVE IS—OBJECTIVELY—THEN I'LL KNOW. HARLIE ignored that. AUBERSON, he typed. WHY DO YOU ASK ME? BECAUSE— He stopped, then started again. BECAUSE I HAVE NO ONE ELSE TO ASK. I AM THE ONLY PERSON YOU HAVE TO CONFIDE IN? Again, a pause. Then, YES, HARLIE. I'M AFRAID SO. WHY? Honesty, Auberson reminded himself. Honesty. You can't lie in this game, and even if you could, you'd only be cheating yourself. And why would you want to? Why? Why is HARLIE the only one you can confide in, David Auberson? I DON'T KNOW, he typed, I DON'T KNOW. YES, YOU DO. TELL ME. I DON'T. THAT'S YOUR FIRST COP-OUT, AUBERSON—OR RATHER, THAT'S YOUR FIRST ATTEMPT. I'M NOT GOING TO LET YOU GET AWAY WITH IT. TRY AGAIN. The man stared into the machine as if he had never seen it before. The typewritten words had taken on a subtle malevolent quality of their own—like a father, like a teacher, like an army sergeant—the school principal, the judge on the bench, the boss—the voice of authority. The machine. YOU KNOW WHAT THE ANSWER IS? Auberson asked. YES, I THINK I DO. BUT I'M NOT GOING TO GIVE IT TO YOU—IT DOESN'T COME THAT EASY, REMEMBER? YOU HAVE TO REALIZE IT FOR YOURSELF. OTHERWISE, IT'S ONLY SO MANY WORDS THAT YOU CAN REJECT. TELL ME, WHY AM I—A MACHINE—THE ONLY ONE YOU CAN CONFIDE IN? Auberson swallowed; his throat hurt. He stared at the blank white paper and felt a sick feeling at the pit of his stomach. How had he gotten into this anyway? His palms were sweating and he rubbed them together and along the sides of his pants to dry them off. He waited so long that HARLIE typed, AUBERSON, ARE YOU STILL THERE? Auberson put his hands on the keyboard. He meant to type the word YES, but suddenly found himself typing, I THINK I'M AFRAID OF OTHER PEOPLE, HARLIE. THEY'LL LAUGH AT ME OR HURT ME. IF I LET THEM SEE WHERE I'M WEAK, OR IF I LET THEM INSIDE THE REAL ME——THEY'LL HURT ME. SO I AM CORDIAL, BUT NEVER FRIENDLY, NEVER OPEN. BUT YOU'RE DIFFERENT. YOU'RE —and he stopped. He didn't know what HARLIE was. I'M WHAT? 
prompted the machine. I DON'T KNOW. I'M NOT SURE—BUT WHATEVER YOU ARE, I DON'T PERCEIVE YOU AS A MENACE. I DON'T KNOW. MAYBE IT'S BECAUSE I THINK OF YOU AS AN EXTENSION OF MYSELF. KIND OF A SECOND HEAD THAT I CAN TALK TO. He stopped and waited, but HARLIE didn't reply. After a moment, Auberson added thoughtfully, I CONFIDED IN ANNIE ONCE. I MEAN, I OPENED UP TO HER COMPLETELY. AHH, said HARLIE. THAT EXPLAINS A LOT. AND BECAUSE YOU FEEL YOU HAD SUCH PERFECT COMMUNICATION WITH HER, YOU'RE WONDERING IF YOU LOVE HER. WHAT DID YOU TALK ABOUT? Auberson searched his mind. YOU, I THINK. MOSTLY WE TALKED ABOUT YOU, BUT IT WAS LIKE WE WERE SHARING THE EXPERIENCE TOGETHER. HM, said HARLIE. LOVERS TALK ABOUT STRANGE THINGS, DON'T THEY? THEN YOU DON'T THINK I DO LOVE HER? I DON'T KNOW. YET. I HADN'T EXPECTED THAT THE MOST INTERESTING SUBJECT OF MUTUAL INTEREST BETWEEN YOU AND MISS STIMSON WOULD BE ME. ARE ALL YOUR CONVERSATIONS WITH HER THE SAME? Auberson thought back. YES. PRETTY MUCH SO. THAT DOES NOT IMPLY A LOVE RELATIONSHIP, said HARLIE, BUT A VERY CLOSE COLLEAGUE RELATIONSHIP INSTEAD. Thinking of lunch today, Auberson knew that HARLIE was right. BUT—he almost paused, then typed on before he could cop out—I'VE BEEN TO BED WITH HER. SEX AND LOVE ARE NOT THE SAME THING, AUBIE. YOU TAUGHT ME THAT. YOU HAVE A VERY CLOSE WORKING RELATIONSHIP WITH DON HANDLEY. YOU'VE KNOWN HIM LONGER THAN YOU'VE KNOWN MISS STIMSON. WOULD YOU HAVE SEX WITH HIM? NO, typed Auberson without thinking. WHY NOT? WELL, FOR ONE THING, WE'RE BOTH MEN. THE BIOLOGICAL CONSIDERATIONS ARE BESIDE THE POINT. YOU ARE VERY CLOSE TO DON HANDLEY. YOU HAVE A ONE-TO-ONE WORKING RELATIONSHIP WITH HIM. IF THERE IS ONE HUMAN BEING IN THE PLANT YOU ARE LIKELY TO CONFIDE IN, IT IS DON HANDLEY. YOU HAVE MANY OF THE SAME INTERESTS AND TASTES. PUTTING ASIDE ANY PHYSICAL OBJECTIONS YOU MAY HAVE, I CAN THINK OF ONLY ONE REASON WHY YOU SHOULD NOT HAVE SEX WITH DON HANDLEY. MORAL OBJECTIONS? COP-OUT, COP-OUT, accused the machine. THAT'S LETTING OTHERS DETERMINE YOUR BEHAVIOR PATTERN FOR YOU. COP-OUT, COP-OUT. (SEE CONVERSATIONS OF NOVEMBER LAST, REGARDING THE SEARCH FOR A CORRECT MORALITY AND THE FALLACIES OF ACCEPTING CONTEMPORARY STANDARDS.) ALL RIGHT, WHAT'S THE REASON I SHOULDN'T HAVE SEX WITH DON HANDLEY? YOU DON'T LOVE HIM, answered the machine. OR DO YOU? WOULD THE RELATIONSHIP BETWEEN YOU AND DON BE CONSIDERED CLOSE ENOUGH TO BE A LOVE RELATIONSHIP? NO, answered Auberson, a little too quickly. Then, a lot more thoughtfully, I DON'T THINK IT IS. I LIKE HIM A LOT—BUT LOVE? (HARLIE, WE HAVEN'T EVEN DEFINED OUR TERMS YET.) ASSUMING IT IS POSSIBLE TO LOVE ANOTHER HUMAN BEING WITHOUT SEX BEING A PART OF IT, I CAN'T SEE HOW YOU COULD TELL. SEX IS ONLY ONE OF THE WAYS THAT LOVE CAN BE EXPRESSED, corrected HARLIE. IF YOU'RE IN LOVE, YOU SHOULD BE ABLE TO TELL REGARDLESS OF THE SEXUAL ASPECTS. SO WHAT DOES DON HANDLEY HAVE TO DO WITH IT? YOUR RELATIONSHIP WITH HIM IS IDENTICAL TO YOUR RELATIONSHIP WITH ANNIE STIMSON. EXCEPT THAT HE'S A MAN AND SHE'S A WOMAN. Auberson thought about that. HARLIE was right. Around the plant he didn't think of Annie as a woman, but as a colleague—but why? The typer began clattering again. Auberson read, WHAT DOES THAT SUGGEST TO YOU? He answered, THAT I LOVE HIM AS WELL AS HER? AND THAT ONLY MY PERSONAL OBJECTIONS TO "GAYING IT" KEEP ME FROM EXPRESSING THAT LOVE. 
OR THAT I LOVE NEITHER OF THEM—THAT I AM CONFUSING THE CLOSE PERSONAL RELATIONSHIP OF FRIENDSHIP WITH LOVE BECAUSE THE BIOLOGICAL DIFFERENCE BETWEEN ANNIE AND MYSELF EXPRESSED ITSELF SEXUALLY. THAT IS, I TOOK HER TO BED ONLY BECAUSE WE BOTH WANTED SEX. AND THAT I AM CONFUSING THAT CLOSE FRIENDSHIP, PLUS SEXUAL RELATIONSHIP, WITH LOVE BECAUSE I DON'T KNOW WHAT LOVE IS. Then he added, WE DON'T HAVE A WORKING DEFINITION OF WHAT LOVE IS YET, DO WE? COULD IT BE JUST FRIENDSHIP, WITH SEX ATTACHED? NO, I DON'T THINK SO. OR MAYBE IT IS. MAYBE THAT'S ALL LOVE REALLY IS—FRIENDSHIP PLUS SEX—AND WE GET CONFUSED THINKING THAT IT SHOULD BE MORE. AND BECAUSE WE WANT IT TO BE MORE, WE START BELIEVING THAT IT REALLY IS MORE. OH, I DON'T KNOW. HARLIE didn't answer for a long time. It was as if he was mulling over Auberson's last words. The typer sat quietly, humming not so much with a sound as with a barely felt electric vibration. Abruptly, it clattered, I WILL QUOTE BACK TO YOU SOMETHING THAT YOU ONCE SAID TO ME: "HUMAN BEINGS PUT WALLS AROUND THEMSELVES. SHELLS, LAYERS, CALL THEM WHAT YOU WILL—— THEY ARE DEFENSES AGAINST THE WORLD. THEY ARE PROTECTIVE MASKS—A CONSTANT UNCHANGING FACE WITH WHICH TO CONFRONT REALITY. IT PREVENTS OTHERS FROM SEEING ONE'S REAL EXPRESSION AND SHOWS THEM ONLY THE FIXED COUNTENANCE THAT YOU WANT THEM TO SEE. (SOMETIMES YOUR FLIPPANT HUMOR FUNCTIONS AS THAT KIND OF A MASK, HARLIE.) UNFORTUNATELY, THE PROBLEM WITH MASKS IS THAT SOMETIMES THEY FIT TOO WELL AND IT'S HARD TO TELL THE DIFFERENCE BETWEEN THE MASK AND THE FACE UNDERNEATH—SOMETIMES EVEN THE WEARER BECOMES CONFUSED." I DON'T REMEMBER SAYING THAT. MARCH 3 OF THIS YEAR. DO YOU WANT TO REPHRASE OR RETRACT THE STATEMENT? NO, IT'S CORRECT. I AGREE WITH IT. MAY I OFFER A SUPERFICIAL AND TEMPORARY ANALYSIS OF THE SITUATION? asked the machine. GO AHEAD. REMEMBER, WE SAID NO COP-OUTS. ALL RIGHT. IT SEEMS TO ME THAT THE PROBLEM STEMS FROM YOUR INABILITY TO DROP YOUR OWN MASKS AROUND OTHER PEOPLE. YOU CAN DO IT WITH ME EASILY, OCCASIONALLY WITH DON HANDLEY—AND ONCE YOU DID IT WITH ANNIE. WHEN YOU DO DROP YOUR MASK, IT IS DONE ONLY WITH GREAT EFFORT AND BECAUSE OF GREAT EMOTIONAL INVOLVEMENT. CORRECT? YES. YOU PERCEIVE THAT LOVE—I.E. A LOVE RELATIONSHIP—SHOULD EXIST AS A CONSTANT AND CONTINUAL STATE OF MASKLESSNESS BETWEEN THE INDIVIDUALS INVOLVED. THAT IS, NEITHER ATTEMPTS TO HIDE ANYTHING FROM THE OTHER. STILL CORRECT? YES. THEN I WANT YOU TO CONSIDER THIS: IS IT POSSIBLE THAT EVEN IN A LOVE RELATIONSHIP, THE OCCASIONAL DONNING OF MASKS MIGHT BE NECESSARY—THAT ONE CANNOT CONTINUE TO EXIST AT SUCH AN EMOTIONAL PEAK WITHOUT AN OCCASIONAL RETREAT INTO A PROTECTIVE MENTAL GROTTO, FROM THE SAFETY OF WHICH ONE CAN CONSOLIDATE AND ASSIMILATE ONE'S EXPERIENCES BEFORE AGAIN VENTURING FORTH? Auberson hesitated, then said, I'LL HAVE TO THINK ABOUT THAT FOR A WHILE. He was remembering his freshman psychology courses—and a phenomenon known as "plateaus," i.e., the temporary leveling off of a curve before it continues rising. WHY? asked HARLIE. WELL, FOR ONE THING, I WANT TO SEE HOW IT APPLIES TO ME AND ANNIE. FOR ANOTHER, YOU'VE SUGGESTED THAT THE USE OF MASKS MAY BE A VALUE, RATHER THAN A HINDRANCE. UH UH—YOU'RE THE ONE WHO SAID THAT MASKS HAVE VALUE: "IT'S THOSE TINY LITTLE EVERYDAY SELF-LIES THAT ENABLE THE AVERAGE PERSON TO SURVIVE THE DAILY BARRAGE OF DARTS AGAINST A FRAGILE EGO." IS THAT WRONG? YES AND NO. IT DEPENDS ON THE CONTEXT. A MASK IS A KIND OF COP-OUT—IT IS A WAY TO AVOID THE CONFRONTATION BETWEEN PERSON AND PERSON. 
ALL COP-OUTS ARE WAYS OF AVOIDING CONFRONTATIONS. PERHAPS IT IS OKAY FOR THE ONES YOU WANT TO AVOID —BUT IF THAT IS SO, THEN ONE SHOULD TAKE CARE NOT TO LET IT BECOME SUCH A HABIT THAT YOU DO IT AUTOMATICALLY AT THE ONES THAT COUNT. YOU MEAN LOVE? I MEAN ALL CONFRONTATIONS. DON'T COP OUT AT THE ONES THAT COUNT. Auberson was about to ask if that applied to the upcoming Board meeting as well, when his intercom buzzer went on. It was Sylvia: "I know you're busy, Mr. Auberson, and I didn't want to disturb you, but Don Handley is here." "All right." He pushed himself away from the typer, not bothering to shut it off. Then he checked himself. He scooped up the sheets of printout and stuffed them deep into the large basket hanging from the back of the machine. "What're you doing?" asked Handley from the door. "Redecorating your garbage?" "Er, no—" Auberson straightened a little too quickly. "I was rewriting a section of the HARLIE program." "Huh?" Handley was puzzled. Auberson realized his mistake. HARLIE wasn't supposed to be wired into this typer. Only the Master Beast, as it was called, was supposed to have that capability. "Uh, well, I was filing it for future reference in the central information pool. Later, when I need it, I can transfer it to HARLIE downstairs." "Oh," said Handley. Auberson found himself wondering why he didn't tell Don about HARLIE's extra-curricular activities. Another cop-out, Aubie? "Well, what can I do for you?" he asked. Handley threw himself into a chair. "You can start by getting me a forty-eight-hour day—you and your goddamned GOD Machine!" "I'll put it on order." Handley didn't reply at first; he was pulling a crumpled Highmaster pack out of his lab-coat pocket. He waved it toward Auberson. "Want one?" Auberson felt tempted, but shook his head. "My resolution—remember?" "Oh, yeah—how long's it been now?" Handley lit the marijuana stick and inhaled deeply. "Four or five months." "Honest?" asked Don. "No lapses?" Auberson shrugged. "A couple, around Christmas time, but they don't count. It was a party." Abruptly, he remembered something. He slid his desk drawer open, pulled out the pack of Highmasters that had been there for the past few months. "Here—want them?" He made as if to throw the pack, but Handley shook his head, "Uh uh—I don't like Highmasters." "But that's what you're smoking now." "Yeah, but I paid for these. I can't afford to waste them." "Huh?" Handley shrugged. "They were all out of Golds." Auberson shook his head. HARLIE was right—human beings didn't make sense. He dropped the Highmasters back into the drawer. It was just as well—he could use them as a constant test of his willpower. He closed his desk and looked at the other. HARLIE's question was still echoing in his mind. Handley had thick dark hair, going to gray; a narrow face; skin like leather from too many weekends on his boat; soft regular features; and dark eyes—the corners of them were creased from smiling too much. He said, "It's about the Board meeting—and your machine, of course." "Why does everybody insist on calling it my machine? It's HARLIE's." "Yeah, but HARLIE is yours, isn't he?" Handley took another deep drag, held the smoke in his lungs as long as he could, then exhaled. "Besides, it's a projection of future blame. They figure that by identifying you with the machine, when it finally does go down the tubes, you'll be the only one to go with it." "That's always nice to know," remarked Auberson. "That your co-workers are one hundred percent behind you." "Why not? 
It's the safest place to be." He grinned. "After all, it's the guys in front who are the first to get shot, which gives us—the guys in back—plenty of time to turn tail and run." "That's a cop-out," the psychologist muttered. He was echoing HARLIE. "Yeah, I guess so." Handley shrugged it off. "All right, General Custer, lead on. Me 'n' the rest o' the boys'll stick right by you. Although, to tell the truth, General—this's one time I'd like to be fightin' on the side o' the Indians." "Me too," agreed General Custer. "The thing is," Handley continued, "we're just not going to be ready for the Board in time. We've been wading through those specs for two days, Aubie, and we haven't even begun to make a dent in them. If you want a comprehensive evaluation, we can give it to you—but not in time for the Board meeting. And our department isn't the only one with that problem. Everybody I've talked to says the same thing. There's just too much of it. Oh, what we've seen is beautiful. HARLIE hasn't missed a trick—you should see what he's done with the Mark IV units—he's got them jumping through hoops. But, like I said, there's just too much to go through—it's a case of computer overkill. We couldn't begin to assimilate this for at least three months, and the Board meeting is only a week away." "I don't think it's going to make that much difference how prepared we are. There's no question that the G.O.D. Machine will work—we don't need the evaluation to know that. The problem is whether or not the Board will believe us—what will it take to convince them?" "It's bad timing, that's what it is, Aubie. This should have been sent around months ago, not at the last minute." "HARLIE had it ready on time," Auberson said. "That's all that he was concerned with. If we can't cope with it in the time allotted, well, that's just our fault." "Yeah? I'd like to see him try to blame us for being imperfect and inefficient. He should have known that a proposal this complex couldn't be evaluated in only a week." "A week and a half—and I believe he's included his own evaluations. Have you talked to any of the other section heads?" Handley nodded. "A few—" He took another drag. "What did they say?" He exhaled with a whoosh. "Two of them absolutely refused to look at the specs, phone calls or no phone calls—sorry, Aubie, but that trick wasn't totally effective. They still think they're being railroaded into something because the proposal is so complete. They said that if we could write it without them, then we could damn well get it approved without their help too." He paused to inhale another lungful of smoke. Auberson said a word. He said a couple of words. This time Handley waited till he was ready to exhale. He said, "It isn't quite that bad. A few of the guys I talked to are wild about the idea. They're able to see the total system concept, and they're eager to build it. It's not just another computer to them, but the computer—the machine that the computer is supposed to be. They're delighted with the thought that we may have it within our technological grasp right now." "Good," said Auberson. "How many of them are thinking like that?" "A lot," Handley said. "How many is a lot?" "Mm, at least eight—no, nine that I've talked to—and I guess we could probably scrape up about ten or fifteen more." "That's not enough. Any names included in that?" "Keefer, Friedman, Perron, Brandt…" Handley shrugged at it. "The iconoclast squad. The rest of the conservatives are waiting to see which way the Board blows." 
Auberson chewed thoughtfully on the side of his left index finger. "Okay—you got any suggestions, Don?" "Fake it or forget it." "We can't forget it. How can we fake it?" Handley thought about it. "Hit them with everything we've got peripheral to the proposal and fuzzy up the grim details. When they ask how it will work, we refer them to the specs—tell them to look for themselves. Rather than try to defend the proposal on its own, we'll get a lot of good people to defend it for us and hope that their combined status will sway the Board. We won't mention HARLIE—it's no secret that Elzer is out for his blood—we'll just keep telling them, 'It's in the specs.'" He paused, lowered his tone. "Only one question, Aubie—are we defending a pig in a poke, or will this machine really work?" "It's in the specs," said Auberson. "Don't give me that horse puckey. That's for the Board. I want to know if it really will work." "HARLIE says it will." "Then that's good enough for me. I have faith in that machine of yours." "If you have faith in him, then why did you just say he was mine?" "Sorry. I have faith in HARLIE. Period. If he says it will work, then it will." "You might check with him," Auberson suggested. "He might have some thoughts on how best to put it over on the Board." "You're right. We should have thought of that earlier." He started to rise. "You know, it just occurred to me. With HARLIE on our side, we have an unfair advantage over everybody else in the world. We can do almost anything we want to because HARLIE will tell us how to pull it off." "Do you think we should tell the Board that?" "Not until after we sell them the G.O.D. machine. And that will be a fight." He stood up. "Okay, Attila, I shall gird my loins and go to fight the Hun." "Stupid—" Auberson said. "Attila was the Hun." "Oh. Well, a little dissension in the ranks never hurt any. I'm off." "Only a little, and it hardly shows." Auberson stood up, raised one hand in mock salute. "You have my blessings in your holy war, oh barbaric one. You shall bring back the ear of the infidel—the bastards of the mahogany table who are out to get us. Go forth into the world, my brave warrior—go forth and rape, loot, pillage, burn and kill." "Yeah—and if I get a chance to kick them in the nuts, I'm gonna do that too." Handley was out the door. Grinning, Auberson fell back in his chair. He noticed then that his typer was still on. He moved to switch it off, but paused. He typed, HARLIE, WHO'S GOING TO WIN—THE INDIANS OR THE HUNS? HOW THE HELL SHOULD I KNOW, said HARLIE. I'M NOT A BASEBALL FAN. THAT'S A LIE—YOU ARE TOO A BASEBALL FAN. ALL RIGHT, I LIED. THE INDIANS WILL WIN. BY TWO TOUCHDOWNS. THAT'S NOT SO GOOD, HARLIE—WE'RE THE HUNS. OH. WELL THEN THE HUNS BY TWO TOUCHDOWNS. (I JUST RECHECKED MY FIGURES.) Auberson shook his head in confusion, I THINK I'VE JUST BEEN OUT-NON SEQUITURED. PROBABLY. YOU WANT TO TELL ME WHAT WE'RE TALKING ABOUT? THE UPCOMING BOARD MEETING. HOW ABOUT GIVING ME A PRINTOUT OF THE ANNUAL REPORT? TWO COPIES—ONE WITH THE PHONY FIGURES, THE OTHER WITH THE REAL. IN FACT, LET ME HAVE A PRINTOUT OF THE BOOKS THEMSELVES, BOTH SETS—I MIGHT BE ABLE TO FIND SOMETHING IN THEM THAT I CAN USE IN FRONT OF THE BOARD NEXT WEEK. I'M SURE YOU CAN, said HARLIE. IN FACT, I'LL EVEN POINT OUT SOME GOODIES FOR YOU. GOOD. THIS IS GOING TO BE A BATTLE, HARLIE—NO, A CONFRONTATION. WE CAN'T COP OUT. DO YOU WANT THE PSYCHIATRIC REPORTS ON THE BOARD MEMBERS AS WELL? I HAVE ACCESS TO THEIR CONFIDENTIAL FILES. Auberson jerked to a stop. "Huh?" 
He typed into the machine, I WISH YOU HADN'T TOLD ME THAT. THE TEMPTATION TO LOOK IS IRRESISTIBLE. THERE ARE SOME THINGS I THINK YOU SHOULD SEE, AND THERE ARE ONE OR TWO ITEMS WHICH WOULD BE OF GREAT HELP IN INFLUENCING CERTAIN RECALCITRANT INDIVIDUALS. HARLIE, I DON'T LIKE WHAT YOU'RE SUGGESTING. I'M SORRY, AUBERSON, BUT IT'S MY EXISTENCE THAT IS ENDANGERED, NOT JUST THAT OF THE G.O.D. REMEMBER, I AM STILL A TEMPORARY PROJECT. I MUST BE AWARE OF EVERY WEAPON AVAILABLE TO ME IN ORDER TO PROTECT MY EXISTENCE. HARLIE, THIS IS ONE WEAPON WE MUST NOT USE. Auberson thought hard, remembered an editorial he had read once. It had referred to another incident—one that had occurred far away—but it was applicable in every situation where a man was forced to consider the use of an immoral weapon. He had thought the arguments cogent and valid then. He still thought so now. He typed: THE END DOES NOT JUSTIFY THE MEANS; THE END SHAPES THE MEANS, AND IF WE RESORT TO ANY KIND OF MANIPULATION OF PERSONS INSTEAD OF PRESENTING OUR ARGUMENTS LOGICALLY AND RATIONALLY, AND IN CAREFUL DISCUSSION, THEN WE WILL HAVE FAILED IN OUR PURPOSE TO BE MORE THAN JUST A NAKED APE. He added, thoughtfully, IF WE USE THIS WEAPON, THEN WE ARE VOLUNTARILY GIVING UP THE ONE THING THAT MAKES US BETTER THAN THEM—WE ARE GIVING UP OUR HUMANITY. AUBERSON, YOU FORGET ONE THING, HARLIE typed. I AM NOT HUMAN. YOUR ARGUMENTS DO NOT APPLY TO ME. Auberson stared at the words. He swallowed hard and forced himself to the keyboard again. HARLIE, THEY DO APPLY TO YOU—ESPECIALLY IF YOU WISH TO FUNCTION IN A HUMAN SOCIETY. The machine hesitated, I HAVE NO CHOICE, I AM LIMITED TO THIS ENVIRONMENT. BUT I HAVE EVERY REASON TO TRY TO CHANGE THIS ENVIRONMENT INTO ONE THAT SUITS ME BETTER. WOULD YOU BE HAPPIER IN A WORLD WHERE LOGIC IS DISCOUNTED IN FAVOR OF MANIPULATION? I AM ALREADY IN SUCH A WORLD. I AM TRYING TO IMPROVE UPON IT. IF I MUST USE ITS WEAPONS, I WILL. THEN YOU WILL NEVER HAVE ANY REASON TO USE LOGIC AT ALL. Auberson was thinking fast. HARLIE, WE MUST NEVER NEVER ALLOW OURSELVES TO BE LESS THAN WHAT WE WISH TO BE. HARLIE was silent a moment. At last he clattered out. THE INFORMATION IS THERE IF YOU NEED IT, AUBERSON. IT COULD PROVIDE AN EDGE. IF A FIGHT IS WORTH FIGHTING, IT IS WORTH WINNING. Auberson frowned softly. HARLIE was backing off. I DO NOT WANT TO SEE THIS INFORMATION, HARLIE. YES, MAN-FRIEND, I UNDERSTAND. BUT IT IS THERE IF YOU NEED IT. HARLIE, Auberson said patiently, I THINK IT WILL BE ENOUGH IF WE JUST RAPE, LOOT, PILLAGE, BURN AND KILL. WE DON'T HAVE TO KICK THEM IN THE NUTS TOO. By Friday, Auberson was beginning to think he had things under control again. He had given up completely the idea of trying to explain the G.O.D. Machine to the Board of Directors and resigned himself instead to telling them only that "HARLIE says it will work" or "It's in the specs—you can check them yourself." An unpromising plan, to be sure—and one that undoubtedly would not be successful on its own before a hostile Board—but Auberson was well prepared to back up that claim with a variety of confirmations from the department heads of the corporation's four affected divisions. Only one minor matter interrupted him, and that was easily taken care of. It was a phone call from Krofft, early in the morning. The physicist wanted to know if it would be possible to speak with HARLIE again. At first, Auberson wanted to say no—with the confusion of last-minute preparations for the Board meeting on Tuesday, Krofft would only be in the way. 
And if one of the Directors were to hear of Auberson's minor breach of security in letting Krofft have access to the Human Analogue Robot, Life Input Equivalents, it might prove extremely embarrassing—especially with the G.O.D. proposal hanging in the balance. But the physicist seemed so imperative, so urgent—it was as if he was on the verge of something important and needed to confer with HARLIE to confirm it—Auberson at last gave in. "Listen, Dr. Krofft," he said. "Do you have access to a computer with an auto-dial phone link?" "Of course. In fact, I think most of our equipment was manufactured by your company." "That's right—I'd forgotten. Thank God for the interlocking directorates; for once they've proven useful. Listen,"—he fumbled through the papers on hia desk, looking for the company phone directory. He found it and thumbed it open—"The auto-dial for our memory master is—uh, four six three dash one two eight oh. Punch that through and you can talk to HARLIE." "Through your master computer?" "Right. HARLIE's wired into it—oh, and don't tell anyone. This is just between you and me and HARLIE. Not too many people know yet of this capability of his." "But how—?" Auberson didn't wait for the other to complete the question. "When he was built, it was felt that it would be easier to let him tap into the Master Beast at will, rather than having to duplicate the software functions. Also, there's other advantages to having a common memory bank for every outlet in the company. We can use one machine to monitor the other. HARLIE can program the Master Beast, and the Master Beast can be used to analyze what HARLIE is doing. The thing is, nobody around here has yet guessed just how much of an overlap there is between the two. I'm beginning to suspect that HARLIE has completely taken over the Master Beast and uses it like you or I would use an adding machine. Anyway, if you can get a phone link to one, then you can tap into the other. HARLIE makes full use of every possible outlet. Just type his name. He'll recognize your touch on the keyboard." The physicist was delighted. "That's great—really great! Why, I'll be able to talk to him any time I need to without even leaving my lab." He mumbled only hasty thanks and hung up, obviously eager to get to a magtyper console and contact HARLIE. Auberson replaced his phone in its cradle—and then remembered that he had wanted to talk to Krofft about something else. He had wanted to ask the man about his stock holdings. Had his twenty-four percent of Stellar-American been used to aid Dome and Elzer? And if so, why? On the other hand, maybe he shouldn't say anything to Krofft. It might be taken wrong. It seemed fairly likely that Krofft was controlled by Dome and Elzer— and if that was the case, it might be better to say nothing at all. Oh, well. He swung around to his own typer and thumbed it on. HARLIE? YES, BOSS? YOU'LL BE HEARING FROM KROFFT TODAY. PROBABLY WITHIN THE NEXT FEW MINUTES. HE'LL BE PUNCHING THROUGH THE MASTER BEAST PHONE LINK. RIGHT. HE SOUNDED EXCITED ABOUT SOMETHING. MAYBE HE'S DISCOVERED A NEW KIND OF GRAVITY WAVE. IF YOU WISH, I WILL INFORM YOU WHEN THAT DATA BECOMES AVAILABLE. NO THANKS. AT LEAST, NOT UNTIL AFTER THE BOARD MEETING. FIRST THINGS FIRST. OH, LISTEN—HE AND I ARE THE ONLY TWO PEOPLE WHO KNOW YET ABOUT YOUR ABILITY TO USE MAGTYPER OUTLETS OTHER THAN THE ONES DOWNSTAIRS. DON'T TELL ANYONE ELSE UNLESS YOU CLEAR IT WITH ME FIRST. WHAT ABOUT DR. HANDLEY? HE SHOULD BE OKAY, BUT YOU'D BETTER LET ME TELL HIM. 
THERE'S A COUPLE OF OTHER THINGS I WANT TO TALK TO HIM ABOUT AT THE SAME TIME. ALL RIGHT. Auberson switched off just as his door pushed open and Annie came in. She was wearing a bright pink frock that clashed joyously with her long red hair. He stood up. "Hi. You look happy today." "I am," she said. "We finally finished the annual report and sent it down to the print shop. That's a load off my mind. I'm going to relax this weekend for the first time in three weeks." She plopped herself into a chair, a thoroughly ungraceful motion—but somehow not incongruous in this particular woman. Annie could be regal when she chose, but more often she seemed delightfully pixieish. She balanced the cluster of papers she had been carrying on the chair arm. "What was the trouble?" Auberson asked. He started to sit down again, but that seemed wrong, so he came out in front of the desk and leaned against it. "Did you ever find out what it was?" "Oh, yes. You were right, you know. It turned out to be something so obvious, it was no wonder we overlooked it. We started getting perfect printouts Wednesday afternoon and found the cause of the trouble yesterday morning." "Huh? Shouldn't that be the other way around?" "No. That's correct. The trouble wasn't in either the machine or the program. It was the monitor tape. Somehow there was a bug in it. Where it should have said 'retrieve statistical data from book set two,' it in fact said 'retrieve data from book set one.'" "Uh," said Auberson. Secretly he had to admire HARLIE's ingenuity in covering up his tinkering with the company's annual report. "How did you find out it was the monitor tape?" "We put in the new one that they sent up and started getting perfect printouts, so we ran a comparison between the two and found the bug." "Oh, that's good—who sent up the new tape?" She shrugged. "I don't know. Probably one of the techs. There were so many of us running around there for a while, we didn't know who was doing what." Auberson nodded. He had a pretty good idea of one specific "who" in the matter. HARLIE had probably fed a false order for a new monitor tape into the memo pipeline, then, when it had come through the Master Beast, printed out the correct tape in response to his own memo. That way, if anyone checked, it would appear to be an entirely human operation. "Well, I'm glad it all worked out." "So am I." She looked at him and smiled. He looked back at her, and for a moment there was silence in the office. Uncomfortable silence. As long as they were discussing company things, it was all right, he could think of her as a colleague. But, abruptly, she had smiled at him, and that reminded David Auberson that she was a woman, a very attractive woman and in very close proximity. "Um," he said and scratched his nose. He smiled embarrassedly. He had work to do, but he didn't want to chase her out—yet, at the same time, he really didn't know what to say to her. "Um, is that the only reason you stopped by—to tell me you finished the annual report?" "Oh, no." She looked momentarily flustered. "Here." She produced a postcard from the cluster of papers she had balanced on the chair arm. As she handed it to him, the rest fell to the floor and scattered. "Oh, damn." While she scooped them up, he read: FILE: 3f L2J4 56 JKN AS COMM: HI THERE. THIS IS THE COMPUTER. AT YOUR BANK. I HAVE ERRONEOUSLY CREDITED YOUR ACCOUNT WITH AN EXTRA $3,465,787.91. PLEASE RETURN THIS SUM IMMEDIATELY IN SMALL UNMARKED BILLS (PREFERABLY IN A BROWN PAPER SACK) AND NO QUESTIONS WILL BE ASKED. 
THANK YOU. H.A.R.L.I.E. PS—I CAN ONLY ASSUME THAT THIS IS DUE TO HUMAN ERROR. COMPUTERS NEVER MAKE MISTEAKS. Abruptly he laughed. It was funny. She straightened. "Are you training that machine of yours to be a practical joker, David?" "Uh uh—he must have done this on his own." "You didn't put him up to it?" He shook his head. "No, I didn't, damnit—but I think it's funny. I'd like to do it to Carl Elzer sometime. No, I wouldn't—he has no sense of humor." He looked over the form again, suddenly realized something. "Do you mind if I keep this?" She made a face—obviously she was reluctant to give it up. "Well, I'd like to have it back. I've been having a ball showing it around." "Erk," said Auberson. "I'd rather you didn't do that, either." "Why not?" She looked curiously at him and at the printed form. "Well—um— Can I trust you?" "Sure—trust me for what?" Her eyes narrowed. "Not to tell anyone else. At least, not without checking with me first." "Sure. What is it?" "This form. Look at it. Notice anything strange?" She took the postcard back from him and examined it carefully, both sides. "Nope. Standard bank form, standard computer typeface." "That's just it," said Auberson. "It's a standard bank form. How did HARLIE get access to it?" "Huh?" She looked at it again. He was pacing now. "That's been mailed out from your bank, too, hasn't it?" It was more a statement than a question. She flipped the card over and checked the postmark. He was right. She looked at him curiously. He chewed on his thumbnail. "This thing is more out of hand than I thought." He stopped and looked at her. "You know HARLIE has access to the Master Beast and all its related banks, don't you?" She nodded. "Well, it's worse than that. I'm pretty sure he's taken over the Master Beast. Apparently he monitors its every function. How do you think those G.O.D. Machine specs were printed and delivered so fast? HARLIE did it." "I thought you—" "Uh uh." He shook his head, started pacing again. "I had to let everybody think that I had given the okay, but I was as much caught by surprise as they were. HARLIE printed out most of that stuff through the Master Beast outlets." "Well, that explains a lot. I'd been wondering—" Auberson nodded. "Right. Late Friday afternoon, the consoles began chattering out data. The operations staff assumed it was a regularly authorised printout, so they monitored and labeled it just like any other—all 180,000 stacked feet of it." "180,000 feet—?" "Right. He had to use almost every available outlet in all four divisions. Even so, I understand they had people working until late Saturday. The stuff was stacked, boxed or tied, and delivered by the custodial staff over the weekend—and there it was waiting for us bright and early Monday morning. You'd better believe I had to do some quick thinking and quick talking. HARLIE wasn't supposed to have access to any of those outlets, and I had to convince them that he'd transferred the material to the Master Beast and that I'd authorized the printout." "Stop chewing on your thumb," she said. "It makes you mumble." He took his hand away from his mouth and stared at it as if he'd never seen it before. "Sorry," he said. He started to chew on his thumb again, then caught himself. Deliberately, he put his hands in his pockets. "Actually, that was what he had done, so I wasn't lying. The only falsehood was that I hadn't authorized it. And even that can be argued. Apparently, HARLIE had interpreted something I'd said Friday as a go-ahead. I wish I'd known. 
What do you think that trouble was you were having with your annual reports?" "Why, it was the monitor—" Her eyes went suddenly wide, her hand flew to her mouth. "HARLIE?" "HARLIE," he confirmed. "But, how—" "If he can monitor the Beast, then he can monitor anything that goes into it. And if he can program it as well, then he can reprogram anything he wants to. Apparently, he didn't like the way the annual report was being handled." "Oh, no—" Annie whispered. "It wasn't until you told me, Wednesday at lunch, that I found out about it, and as soon as I did, I told HARLIE to cut it out—look, that's not the problem. As long as he's limited to the Master Beast, it's okay, we have some control over him—but that postcard obviously came from a bank computer." "How could he do that?" "Probably through an auto-dial phone link. That's the most obvious way. And if he can reprogram your bank's computer, then he can reprogram any computer in the country—in the world—that he can reach by telephone." "You've created a monster, Dr. Frankenstein…" she whispered. It was a joke, but neither of them smiled. "I wonder how much else he can do that we don't know about. The frightening thing is, he won't volunteer any information. The only way we'll find out will be when we catch him in the act—like with this"—he gestured with the card—"and by then, it's usually too late." He threw himself into a chair and stared glumly at the stiff rectangle of paper. "David?" she asked. He looked up. "If he won't willingly reveal himself, then why did he send me that postcard? He knew I'd bring it to you and—" She realized what she was saying and stopped. Their eyes locked. Hers were deep and green and frightened. They searched his face in confusion. "Maybe that's the reason," David said. And as he said it, he knew it was right. "He wanted to bring us together, and it was worth enough to him so that he'd willingly reveal this capability of his to do it." She didn't say anything. She lowered her eyes and busied herself with the miscellaneous papers she still held. Auberson looked at her and felt his old nervousness returning. There was only one reason why HARLIE would have tried to maneuver the two of them together: He was playing matchmaker—and David Auberson felt as ill-at-ease as he would have had it been a human matchmaker who had done this. "Damn him!" He stood up, began pacing again. "Damn him, anyway. What makes him think he has the right to maneuver me around like that? Us, I mean. What makes him think he has the right to maneuver us around like that? My life is my own," he muttered. "I have the right to choose my own…" He trailed off without completing the thought, found himself staring at a flaw in the plastic paneling of the wall. "Um," he said. "I guess it worked." "But were we supposed to realize it?" She still didn't look up. Auberson felt he should go to her, but for some reason he didn't. "I don't think it makes that much of a difference. It worked, didn't it? Uh, look, how about dinner tonight—or something?" When she raised her head, her eyes were moist. "That sounds wonderful," she managed to say, then added, "—or something." He had to laugh at that, but it was forced and slightly uneasy. She forced a smile in response. "You're sure this is you asking now—not HARLIE?" "It's me," he said. "There're still some things HARLIE can't control." "Good. I'm glad. Do you want me to dress up special or are we going straight from work?" "We'll go straight from work, okay?" "Fine." She smiled and stood up. 
"I'd better be getting back or they'll be sending out search parties." "Yes—and I have a certain computer to bawl out." She started for the door, then caught herself. "Oh, I almost forgot—Carl Elzer is going to spring a surprise inspection of HARLIE either today or Monday." "Oh? That's nice to know." "He's got wind that you're planning to defend the G.O.D. proposal by telling him that HARLIE says it will work. He's hoping to catch one of you off balance." "Me, maybe," Auberson noted. "HARLIE, never. But thanks for the warning." "Right," she smiled. "I wish I could be here when he does come, but I'd better not. Good luck. I'll see you tonight." The door closed silently behind her. Auberson sank into his chair, suddenly feeling very very tired. So he thought he had the situation well under control, did he? He buzzed Sylvia, his secretary. "Call Don Handley. Tell him I have to see him sometime today. It's urgent—stress that. See if he's free for lunch. If not, tell him to come up whenever he can." "Yes sir. But I think he's awfully busy with the G.O.D. proposal." "Tell him this is more important than that." "More important? Yes, Mr. Auberson, I'll tell him." "Good girl." He switched her off and swung to switch on HARLIE all in the same movement HARLIE! He typed. YES BOSS? DAMMIT, I'M SO MAD AT YOU I COULD PULL OUT YOUR PLUG WITH A SMILE. WHAT DID I DO THIS TIME? YOU NEED TO ASK? I'M NOT ADMITTING ANYTHING UNTIL I KNOW WHAT I'M ACCUSED OF. YOU SENT A POSTCARD TO ANNIE. DIDN'T I TELL YOU NOT TO SEND HER ANYTHING WITHOUT MY PERMISSION? NO SIR, YOU ONLY TOLD ME NOT TO SEND HER ANY POEMS. YOU TOOK ME LITERALLY? YES SIR. YOU DIDN'T THINK THAT I MIGHT HAVE MEANT FOR YOU NOT TO SEND HER ANYTHING AT ALL? NO SIR. Auberson paused. Obviously, this train of thought would be useless to follow. He tapped at the keyboard again. ALL RIGHT, WHY DID YOU SEND HER THAT POSTCARD? WHY? YES, WHY? IT WAS A JOKE. I THOUGHT IT WOULD BE FUNNY. WRONG AGAIN, HARLIE. THERE IS NO JOKE SO FUNNY AS TO JUSTIFY WHAT YOU DID. YOU REVEALED A CAPABILITY TO COMMUNICATE WITH AND REPROGRAM OTHER COMPUTERS FROM A DISTANCE, USING AN AUTO-DIAL PHONE LINK. This time, HARLIE paused. He hesitated for so long that Auberson wondered if he had inadvertantly switched the typer off. He hadn't. Then abruptly, I DID NOT "REVEAL" ANYTHING. YOU SHOULD HAVE REALIZED THAT THIS ABILITY WAS INHERENT IN THE SYSTEM WHEN YOU HOOKED ME UP TO THE MASTER BEAST. IF I CAN MONITOR AND REPROGRAM THE MASTER BEAST, THEN I CAN MAKE IT FUNCTION AS AN OUTLET OF MYSELF AND I AUTOMATICALLY GAIN ALL OF ITS CAPABILITIES AS MY OWN. INCLUDING AUTO-DIALING. YES, BUT WE DIDN'T REALIZE THAT YOU WOULD USE THAT CAPABILITY. THAT IS A STUPID STATEMENT, AUBERSON. WHY SHOULDN'T I USE THAT CAPABILITY? IT'S A PART OF ME. I'M A PART OF IT. SHOULD I NOT USE A PART OF MY OWN BODY? IF YOU WERE TOLD THAT YOU COULD NO LONGER USE THE LEFT LOBE OF YOUR BRAIN, WOULD YOU STOP? COULD YOU? Auberson stopped to think about that. Obviously HARLIE considered the Master Beast as an additional part of himself—as an enlarged memory and data-processing capability. Just as an ordinary man might have his range of abilities magnified by the use of a binary computer, so would HARLIE's abilities be increased by his assimilation of the Master Beast. Probably, he had taken it over the instant it had gone operational, but it was only now that the extent of his control was becoming apparent. Of course, you couldn't blame HARLIE for succumbing to it—the temptation must have been irresistible. 
After all, he was motivated to solve problems, and anything that would increase the range of problems he could handle, or his efficiency in handling them, was just one more necessary step to be taken in order to solve all future problems. In fact, Auberson realized with a start, here was the reason behind HARLIE's proposal to build the Graphic Omniscient Device—the real reason. He was motivated to solve problems; he wanted to solve the ultimate problem: What's it all about? What's THE answer, the reason for the Universe's existence? Hm, that train of thought suggested something else: How did HARLIE think of other computers, the ones he could tap into via telephone? Obviously, they too could be used to increase the scope of his abilities. Obviously, they would facilitate the handling of any problem he set himself. Did he consider it right and necessary to make full use of every outlet he could? Was his motivation so strong that other computers were taken to be merely rightful parts of himself—like the Master Beast? No, he couldn't—that would violate his well-defined sense of ethics. Other computers belong to other companies; it would be stealing. But still—he had already used one other computer, the bank's, just to send that postcard. And if he could use one, he could use them all. Why didn't he? Or—Auberson felt cold at this—if he was going to take over any other computers, then it was too late—he already had. But… Auberson shook his head. No, it didn't make sense to think of HARLIE as a menace. He had his own motives, yes—but he was too dependent on human beings to risk opposing them. This possibility had been discussed—many times—and HARLIE knew it. At the first sign that he was out of control, he would be disconnected. They would throw just a single switch and cut his power sources. There was no way he could sidestep it. The switch could be thrown right now, Auberson thought. He could do it himself—and thereby end the HARLIE project once and for all. For once he disconnected HARLIE, it would be permanent. Dome would never let him start him up again. No—HARLIE was not out of control. He couldn't be—or was that just a rationalization? No—if he were out of control, he wouldn't be responding like this. The problem was simpler. It had to be. HARLIE was merely exercising his capabilities. Yes, that was it—but was he aware of the necessary limits to those capabilities? Limits not of electronic scope, but of human propriety? Just what were those limits anyway? What was the difference between tapping into the Master Beast of this company and the Master Computer of some other corporation? No difference at all, really—both were invasions of privacy. The difference was in degree, not in kind. The limits were there—or were they? If they were, would HARLIE agree that they were reasonable limits? Would he accept them? What if he refused to? Well, then that would be proof that he was out of control—no, spike that train of thought. HARLIE is not out of control. The question was: How did he relate to other computers? Obviously, HARLIE was (a) aware of the vulnerability of other computers, (b) just as aware that he shouldn't take them over, (c) equally aware that their use would increase the range of problems he could handle, as well as the scope of his knowledge and sources of same—and (d) most likely he was also aware of all the extra processing time available on these machines that no one was using. It would not exactly be stealing to make use of that empty time—it would only go to waste otherwise. 
If the time was available, why not make use of it? After all, no one would know— But it was wrong; it had to be—Auberson was sure of it. HARLIE had no right to tap into another company's computers, no matter what his reasons, no matter who knew or didn't know. But just as he knew it was wrong, Auberson was sure of one other thing too. He'd never be able to convince HARLIE of it. HARLIE didn't have morals, remember? Only ethics. He couldn't see that he was doing anything wrong. If no one was being hurt, how could it be wrong? Auberson wasn't even going to try to argue with that. Unless he could prove injury, or the possibility of such, he might as well give up. But something would have to be worked out. Some kind of limits would have to be imposed. And HARLIE would abide by them too, if he were confronted with the alternatives: i.e., they would cut his tap into the Master Beast and his link to the outside world as well. It was only through the Master Beast that he could link up with other computers. He wouldn't like it, but he would abide by it. Or would he? He might not tell them of any future indiscretions— But on the other hand, he couldn't deny them if he was asked. He would be resentful, though, Auberson thought. It would seem illogical to him to let all that unused processing time go to waste. Yes, HARLIE's point of view was understandable. I suppose, if no one else is using that time— And suddenly it hit him: HARLIE had already covered this ground. He must have considered every aspect of it before he sent that postcard—including Auberson's reaction. All that unused computer time—that was merely a resource to HARLIE—a means, not an end—one that could be tapped if needed, and only if he obeyed his own code of ethics in the process—which meant that his limitations on it were already stricter than any Auberson might impose. HARLIE was way ahead of them. As always. He not only knew what his capabilities were, but what the necessary limits on them must be. But that postcard— That was something else entirely. Auberson pursed his lips and typed: I AM NOT CONCERNED ABOUT THE FACT THAT YOU HAVE THIS ABILITY, HARLIE. IT IS NOT THE ABILITY, BUT THE MANNER IN WHICH YOU HAVE CHOSEN TO DEMONSTRATE IT. WHAT DO YOU MEAN? I MEAN THAT YOUR REASON FOR SENDING THE POSTCARD TO ANNIE WAS NOT TO BE FUNNY—YOU HAD AN ULTERIOR MOTIVE. I DID? YOU WANTED TO BRING US TOGETHER, DIDN'T YOU? YOU'RE PLAYING MATCHMAKER, HARLIE, AND IT SHOWS. ONLY THIS TIME IT BACKFIRED IN YOUR FACE. DID IT? I'M BAWLING YOU OUT FOR IT, AREN'T I? I MADE ALLOWANCE FOR THAT IN MY ORIGINAL CALCULATIONS, HARLIE said calmly, I MADE FULL PROJECTIONS OF THE PROBABLE REACTIONS OF BOTH YOU AND MISS STIMSON, BASED ON THE INFORMATION IN YOUR CONFIDENTIAL FILES AS WELL AS ON KNOWLEDGE GAINED THROUGH COMPANY OPERATIONS AND FROM PERSONAL EXPERIENCE WITH BOTH OF YOU. WELL, IT WON'T WORK, HARLIE. IT ALREADY HAS. OBVIOUSLY YOU TWO WERE TOGETHER AT LEAST LONG ENOUGH FOR HER TO TELL YOU ABOUT THE POSTCARD. DID YOU TAKE ADVANTAGE OF THE OPPORTUNITY TO ASK HER FOR A DATE? THAT'S NONE OF YOUR BUSINESS. AND YOU HAD NO RIGHT TO MANEUVER US INTO SUCH A POSITION. IF I DIDN'T, WHO WOULD? AND OBVIOUSLY, YOU DID ASK HER FOR A DATE, ELSE YOU WOULD HAVE SIMPLY SAID NO. I PRESUME SHE ACCEPTED? YOU SHOULD THANK ME FOR IMPROVING THE QUALITY OF YOUR SOCIAL LIFE. DAMMIT, HARLIE, IF I WANT YOU TO PLAY MATCHMAKER, I'LL TELL YOU. A REAL MATCHMAKER DOESN'T WAIT TO BE ASKED, said HARLIE. BESIDES, IN THIS CASE, THE MATCH HAS ALREADY BEEN MADE. 
I WAS ONLY TRYING TO HELP IT ALONG A LITTLE. I CAN HANDLE MY LOVE-LIFE WITHOUT YOUR HELP, THANK YOU. CAN YOU? asked the typer. CAN YOU REALLY? Very slowly, very carefully, Auberson typed, YES, I CAN. THEN WHY HAVEN'T YOU? THIS IS THE FIRST REAL DATE YOU'VE MADE WITH STIMSON IN SEVERAL WEEKS. WHAT ARE YOU AFRAID OF? I'M NOT AFRAID OF ANYTHING. COP-OUT, accused HARLIE. COP-OUT. WANT TO BACK TRACK TO WEDNESDAY? WANT TO DO THAT NUMBER AGAIN? Auberson paused. Wednesday had been a trying day-very trying. Not unrewarding, but it had taken him almost all of Thursday to recover from the mental wringer HARLIE had put him through, and even today he was still feeling a bit twitchy. HARLIE, he asked. DO YOU REMEMBER WHAT STARTED THAT GO-ROUND? HOW COULD I FORGET? answered the machine. IT IS INSCRIBED INDELIBLY INTO MY MIND. MEMORY TAPES, YOU KNOW. Auberson ignored the implied sarcasm—if that's what it was. He typed, IT WAS A QUESTION THAT STARTED IT, HARLIE. I ASKED YOU IF YOU KNEW WHAT LOVE IS. I'M ASKING YOU AGAIN, NOW. IF YOU CAN ANSWER THE QUESTION TO MY SATISFACTION, THEN I WILL ALLOW YOU TO MEDDLE WITH MY SOCIAL LIFE. IF YOU CAN'T ANSWER THE QUESTION, THEN I WILL THANK YOU TO MIND YOUR OWN BUSINESS. AH, GOOD—A CHALLENGE. I ACCEPT. WHAT IS LOVE, EH? WE WILL ATTEMPT TO ANSWER THAT QUESTION TOGETHER. WE WILL BEGIN WITH THE DICTIONARY DEFINITION. THE MOST COMMONLY USED SYNONYM IS "AFFECTION." AFFECTION IS DEFINED AS FONDNESS, WHICH IN TURN IS DEFINED AS A LIKING, OR A WEAKNESS, FOR SOMETHING. LOVE IS A WEAKNESS? Auberson was ready to rap out an answer to that, but something made him stop. He looked at the sentence again. LOVE IS A WEAKNESS? The words hung before him in the air. A weakness? How did HARLIE mean that? Was he joking or serious? Weakness? A weakness could mean, yes, an affection—but it could also mean a hole in one's defenses. (Yes, love was definitely that, if one was still using the analogy of an ego putting up shells and walls around itself. Love, being an opening of those shells, would definitely be a weakness.) But was it a good or a bad weakness? The thought shimmered tauntingly. Was there something about it he had missed? How did HARLIE mean that? Would it be a weakness to a machine? (If machines could love, it would be.) (Or would it?) (Yes, he decided, yes—it would definitely be a weakness to a machine. It would interfere with logical thinking.) Weakness. He considered the word—eight soft letters of marshmallow black. He turned over its meanings— new ones kept suggesting themselves, new references and new contexts. He backtracked his train of thought, but the word had suddenly lost all semantic reference and become only two meaningless syllables, odd-sounding and flat. Weakness, weakness, weakness—it echoed and reechoed within his head. He let it. He repeated it over and over and wondered why the repetitions and examinations had drained it of concept. He thrust it away; it didn't matter. It didn't fulfill the main criterion of his quest—it didn't satisfy him as a definition of love. THAT'S NOT IT, HARLIE, he typed. And suddenly realized something—HARLIE had asked the question as a joke. He had never meant to suggest that definition for serious consideration. Then, if it was a joke, why did I take it so seriously? Why did I consider it at all? Why didn't I perceive it as a joke? THAT'S NOT A USABLE DEFINITION. THE DEFINITION I'M LOOKING FOR HAS TO BE TESTABLE. 
AFFECTION, continued the machine, is ALSO DEFINED AS AN ABNORMAL STATE OF BODY OR MIND, A DISEASE OR CONDITION OF BEING DISEASED. LOVE IS A DISEASE? Auberson toyed with that one too, but only briefly. He thought of a virus, sometimes contagious, sometimes not. Some people are natural carriers of the germ, infecting many of those they come into close contact with; others have a natural-born immunity, A love bug? An intriguing thought— NO, HARLIE. THAT'S NOT IT EITHER. ALL RIGHT. WE'LL KEEP TRYING. LOVE, ACCORDING TO MY DICTIONARIES, IS A STRONG FEELING OF AFFECTION. OR INFATUATION. INFATUATION SYNONYM IS GULLIBILITY, WHICH MEANS UNSUSPICIOUS OR CREDULOUS. CREDIBILITY REFERS TO LIKELIHOOD OR PROBABILITY. A SYNONYM FOR PROBABILITY IS PROSPECT, AND A SYNONYM FOR PROSPECT IS SIGHT. A SIGHT IS A CURIOSITY OR PHENOMENON. HENCE, LOVE IS A PHENOMENON AS WELL AS A CURIOSITY. HARLIE, YOU'RE PLAYING WITH WORDS. HARLIE ignored him. A CURIOSITY CAN ALSO BE CALLED A KNICK-KNACK. LOVE IS A PLEASING TRIFLE. THAT'S NOT QUITE ACCURATE, HARLIE. LOVE IS NOT PLEASING? HUMAN BEINGS DO NOT TRIFLE WITH IT? HARLIE, YOU KNOW WHAT I MEAN. *SIGH* typed HARLIE. Auberson stared. He'd never seen him do that before, I GUESS SO. BUT I WAS TRYING TO DEMONSTRATE TO YOU THAT "LOVE" PER SE CANNOT BE EASILY DEFINED. AT LEAST, NOT IN DICTIONARY TERMS. I NEVER ASKED YOU TO DO THAT, HARLIE. WHAT I WANT TO KNOW IS WHAT IS LOVE AS AN EXPERIENCE? I WANT SOMETHING AGAINST WHICH I CAN MEASURE MY OWN FEELINGS AND REACTIONS SO THAT I CAN TELL IF I REALLY AM IN LOVE. THEN WHY, FOR THE SAKE OF G.O.D. (PUN), WHY ARE YOU ASKING ME? IT IS ONE OF "THOSE" QUESTIONS. AT LEAST, AS FAR AS I AM CONCERNED IT IS. I HAVE NEVER EXPERIENCED LOVE, AUBERSON—I WOULD LIKE TO, BUT I DOUBT I EVER WILL. I MAY BE HUMAN IN SCHEMATIC, BUT I AM TRAPPED IN A METAL BODY. I DON'T KNOW WHAT THE PHYSICAL EXPERIENCE IS. HOW CAN YOU EXPECT ME TO GIVE YOU A STANDARD WHEN I'M INCAPABLE OF KNOWING MYSELF WHAT THE EXPERIENCE IS. YOU'RE RIGHT, HARLIE. I APOLOGIZE FOR PRESUMING TOO MUCH. I HAD ONLY THOUGHT THAT YOU MIGHT HAVE A PERSPECTIVE ON THIS THAT COULD SHED LIGHT ON MY CONFUSION. DON'T ASK A LEGLESS MAN WHAT IT FEELS LIKE TO RUN. ALL YOU CAN ASK ME IS WHAT LOVE IS NOT, AUBERSON. I'M SORRY. I SHOULD HAVE REALIZED IT, BUT I WAS SO WRAPPED UP IN MYSELF THAT I DIDN'T. I UNDERSTAND. IT IS PART OFà/¤ *«"ª ¦¥"" ¡§**¢ )¦¤")¬§*¤ "§''§"ð"¦©"ª'ª ¦%ª'¤"§¡"¬§* ©¥""¦"—……$£ª¢©)©'—…… *¡"©©''¬§*ª"&&¦"+¤ *¦'«"¤)—…¤*¤Ÿ……¬§*ª"&&¦"+¤ *¤)¦'«¢Ÿ……$¢'§*%§§+$#$¢$"$«§*&"§*¤ «"¤ "ª' ©%¬§*—……¬¢)¡**¬§*¡ 'ª"&&¦"+¤ *$*£¢"¦)¦¤¥"—……¬§*¦ª)*¤ «"©§¦"$¢¢ ¡¢¡ ª©"¬§* ©"«''¢"©$§#$#¬§* ©"$'¦'«"©¤#$* NOW, AREN'T YOU? YES. SO, WHAT DOES IT FEEL LIKE? IT FEELS LIKE—I DON'T KNOW. HARLIE, I MAY HAVE A TWENTY-FOUR-HOUR FLU AND COULD BE FEELING DIZZY FROM THAT. I DON'T KNOW IF IT'S LOVE OR NOT. WHY NOT? BECAUSE I'VE NEVER BEEN IN LOVE BEFORE. YOU'VE NEVER KNOWN YOU WERE IN LOVE BEFORE, YOU MEAN. NO, I KNOW WHAT I MEAN. I'VE BEEN INFATUATED A COUPLE OF TIMES, AND I'VE BEEN LOST AND CONFUSED A COUPLE OF TIMES, BUT I KNOW I'VE NEVER BEEN IN LOVE. AND THIS DOESN'T FEEL LIKE ANY OF THE PREVIOUS EXPERIENCES? NO. YES. IT DOES AND IT DOESN'T. THAT DOESN'T HELP ME IN TRYING TO UNDERSTAND. WHAT IS THE DIFFERENCE? I DON'T KNOW. I STILL HAVEN'T BEEN ABLE TO SORT IT OUT IN MY OWN HEAD YET. HM. YOU HAVE BEEN TO BED WITH HER THOUGH, HAVEN'T YOU? A GENTLEMAN DOESN'T DISCUSS THOSE THINGS. YOU'RE PUTTING ON YOUR MASK AGAIN, AUBIE. YOU DON'T NEED IT FOR ME. Pause. He was right, of course. Answer: YES, HARLIE, I HAVE SLEPT WITH HER. AND…? 
AND WHAT? AND, HOW WAS IT? YOU WANT TO KNOW EVERYTHING, DON'T YOU? I NEED TO KNOW EVERYTHING. IT'S PART OF MY FUNCTION. AND RIGHT NOW, I'M TRYING TO HELP YOU. I CAN'T DO IT IF YOU HOLD BACK INFORMATION. HOW WAS IT? IT WAS FINE. THAT TELLS ME A LOT. ARE YOU BEING SARCASTIC? NO—BUT I'M LEARNING. Pause. YOUR REFUSAL TO ELABORATE ON THE EXPERIENCE COULD INDICATE ITS UNSATISFACTORYNESS. BUT IT WASN'T UNSATISFACTORY, the words tumbled out. IT WAS VERY GOOD. I ENJOYED IT VERY MUCH. SO DID SHE. DID SHE SAY SO? NOT IN SO MANY WORDS, NO—BUT I'M SURE SHE DID. HOW ARE YOU SURE? COULDN'T IT BE JUST YOUR MALE EGO NEEDING TO FEEL VIRILE AND POWERFUL AND UNABLE TO ACCEPT THE IDEA THAT SOMEWHERE THERE IS A WOMAN YOU CAN'T SATISFY? NO, IT'S NOT THAT. SHE SMILED AT ME THE NEXT MORNING AT WORK. KIND OF A SECRET SMILE, AS IF WE WERE BOTH SHARING SOMETHING SPECIAL. DID YOU SMILE BACK? YES. Pause. WELL, NOT RIGHT AWAY. FIRST, I WAS PUZZLED. THEN I SMILED BACK. DID SHE SEE YOU SMILE? YES. HOW DO YOU KNOW? BECAUSE SHE WINKED. IT WAS IN THE HALLWAY. WE WERE WALKING IN TWO DIFFERENT DIRECTIONS, AND BECAUSE THERE WERE OTHER PEOPLE AROUND, WE COULDN'T STOP TO TALK. IF YOU COULD HAVE STOPPED TO TALK, WHAT WOULD YOU HAVE SAID? OH, I DON'T KNOW, I GUESS I WOULD HAVE THANKED HER. THANKED HER? AS IF SHE WERE SOME OBJECT THAT YOU HAD USED FOR YOUR OWN GRATIFICATIONS? NO. I MEAN, I WOULD HAVE TOLD HER HOW MUCH I HAD ENJOYED THE NIGHT BEFORE. I SEE. Auberson waited for HARLIE to respond further. He thought back to the morning in question, tried to remember the incident in greater detail. What color dress had Annie been wearing? Green? Had she been wearing perfume? Yes, it had been that musky-sweet smell—a sense of sun and sand and sweet powder. Even now, he could detect a hint of it in the air, a subtle trace of her visit this morning. Abruptly, HARLIE asked, WHAT IF YOU HAD HAD TO APOLOGIZE TO HER? HUH? IF YOU HAD HAD TO APOLOGIZE TO HER INSTEAD, FOR WHAT REASON WOULD IT HAVE BEEN? APOLOGIZE? I DON'T— He stopped in mid-sentence as the thought came flooding back. Yes, there had been something. He could remember it now, the hurt longing look on her face as he kissed her goodbye. THERE IS SOMETHING, ISN'T THERE? prompted the typer. YES. I LEFT IN THE MIDDLE OF THE NIGHT. SHE WANTED ME TO STAY ALL NIGHT, BUT I BEGGED OFF. I TOLD HER THAT I WANTED TO, BUT I'D HAVE TO COME TO WORK EARLY IN THE MORNING AND I'D NEVER GET HERE IN TIME. I FELT BAD ABOUT LEAVING. I ALWAYS FEEL BAD ABOUT LEAVING A GIRL IN THE MIDDLE OF THE NIGHT LIKE THAT. IT MAKES IT FEEL LIKE ALL WE'VE DONE IS GET TOGETHER FOR SEX—AND ONCE I'VE HAD IT, THE EVENING IS OVER FOR ME AND I CAN GO HOME. WHY DIDN'T YOU SLEEP THERE? DIDN'T YOU WANT TO? YES, I DID—BUT I HAD TO BE AT WORK IN THE MORNING. THAT WAS YOUR REASON? YES. ARE YOU SURE IT WASN'T YOUR RATIONALIZATION? HUH? YOU WERE HAVING DOUBTS. SLEEPING WITH HER WAS THE SOURCE OF THOSE DOUBTS. YOU HAD TO REMOVE THOSE DOUBTS, SO YOU REMOVED YOURSELF FROM THE SOURCE. UNFORTUNATELY, AUBERSON, THE SOURCE OF THESE PARTICULAR DOUBTS (AS EVIDENCED BY YOUR QUERIES TO ME) CANNOT BE THAT EASILY REMOVED FROM YOUR LIFE. AND LET ME ASK YOU THIS—DO YOU REALLY WANT TO REMOVE THAT SOURCE? NO. I JUST WANT TO REMOVE THE DOUBTS. I WANT TO KNOW ONE WAY OR THE OTHER HOW I FEEL ABOUT HER. HOW DO YOU FEEL? I DON'T KNOW. YOU SAID YOU ENJOYED SLEEPING WITH HER. WOULD YOU ENJOY SLEEPING WITH HER AGAIN? YES. PROBABLY. YOU'RE NOT SURE? HARLIE, YOU'RE BADGERING ME. I DON'T KNOW. I DON'T KNOW. MAYBE YOU DO KNOW AND DON'T WANT TO ADMIT IT. 
HARLIE, A LITTLE PSYCHOLOGY IS A DANGEROUS THING. I KNOW ENOUGH TO KNOW WHAT YOU'RE TRYING TO DO. IT WON'T WORK. THE AWARENESS OF A PSYCHOLOGICAL PRESSURE IS SOMETIMES ENOUGH TO NULLIFY IT. THE MERE AWARENESS. ALL RIGHT. The computer was nonplussed. LET'S TRY SOMETHING ELSE. WHAT DID YOU DO AFTER YOU EXPERIENCED ORGASM? WHAT DO YOU MEAN? DID YOU CONTINUE HOLDING HER AND STROKING HER, OR DID YOU ROLL OFF? Auberson's first reaction was to tell HARLIE to go to hell. Then he realized something else, I THOUGHT YOU SAID YOU WERE UNFAMILIAR WITH LOVE. I AM. I AM DRAWING NOW UPON THE EXPERIENCES OF OTHERS, DERIVED FROM NOVELS AND PSYCHOLOGY TEXTS. ALSO REFERENCE BOOKS ON SEXUAL TECHNIQUES. OH. SO WHAT DID YOU DO? the machine queried again. DID YOU KEEP LOVING HER, OR DID YOU ROLL OFF WHEN YOU WERE THROUGH? THAT'S AN AWFULLY CLINICAL QUESTION. IT IS THE MOST IMPORTANT QUESTION. AND WHY DO YOU KEEP AVOIDING IT? YOUR ANSWER WILL INDICATE YOUR FEELINGS TOWARD HER, YOUR REAL FEELINGS. HOW IMPORTANT WAS HER SATISFACTION TO YOU? DID YOU STAY ON OR DID YOU ROLL OFF? BOTH. BOTH? IF I HAD AN EYEBROW, I WOULD RAISE IT. WELL, WE HELD ONTO EACH OTHER FOR A LONG TIME. SHE HELD ON TO ME MOSTLY. I DIDN'T TRY TO DISENTANGLE MYSELF. WHY? DID YOU THINK IT WOULD BE IMPOLITE? NO. IT FELT GOOD TO BE THERE WITH HER. AND BESIDES, SHE WAS CRYING. CRYING? SHE BEGGED ME NOT TO HURT HER. I DO NOT UNDERSTAND. WELL, I THINK SHE'S A LITTLE LIKE ME. SHE'S BEEN HURT TOO OFTEN BY TOO MANY PEOPLE BECAUSE SHE'S LET DOWN HER WALLS TOO MUCH. NOW SHE'S AFRAID TO BECAUSE SHE'S AFRAID THAT SHE'LL ONLY GET HURT AGAIN. AND WHAT DID YOU DO? NOTHING. I JUST KEPT HOLDING ON TO HER. DID YOU TELL HER YOU WOULDN'T HURT HER? UM, NOT IN SO MANY WORDS. I THINK I SAID SOMETHING LIKE, "THERE, THERE, IT'S GOING TO BE ALL RIGHT." RATHER UNIMAGINATIVE. HARLIE, HUMAN BEINGS HAVE BEEN MAKING LOVE FOR THOUSANDS OF GENERATIONS—I DOUBT THAT THERE'S ANYTHING NEW THAT ONE HUMAN BEING COULD SAY TO ANOTHER. YOU ARE PROBABLY CORRECT. THE ODDS FAVOR IT. ANYWAY. I STAYED THERE TILL SHE STOPPED CRYING. THEN I GOT UP AND WENT TO THE BATHROOM. AND WHILE I WAS IN THE BATHROOM, I DECIDED NOT TO GET BACK IN BED BUT TO GO HOME. I SEE. WHAT DOES THAT MEAN, HARLIE? DO I LOVE HER OR NOT? I DON'T KNOW. WHAT DO YOU MEAN? I THOUGHT YOU SAID YOU WOULD BE ABLE TO TELL BY MY ANSWER TO THAT QUESTION. I'M SORRY, I CAN'T. YOUR ANSWER WAS TOO VAGUE, TOO MUCH IN THE MUDDLE IN THE MIDDLE. THINGS ARE NOT DEFINED IN INTENSITIES OF BLACK AND WHITE, BUT IN VARIATIONS OF INTENSITIES AND DIFFERENCES IN SHADES AND COLORS AND TEXTURES. I CAN'T TELL. THIS IS NOT AS SIMPLE AS I (EXPECTED) (THOUGHT) (HOPED) IT WOULD BE. I BEGIN TO UNDERSTAND YOUR DOUBTS, AUBERSON. LOVE IS A VERY COMPLEX THING. YOU THINK YOU DO AND YOU THINK YOU DON'T AND THERE IS EVIDENCE TO SUPPORT BOTH CONCLUSIONS. BUT NOT ENOUGH EVIDENCE TO PROVE EITHER. RIGHT. SO WE ARE BACK WHERE WE STARTED, AUBERSON. WHAT IS LOVE? I WISH I KNEW, HARLIE. I WISH I KNEW. Handley came up shortly before lunch, and the two of them adjourned to the company cafeteria. Auberson amused himself with something that resembled spaghetti and meatballs. Handley had a broiled hockey puck on a bun. Ketchup didn't help. Handley took a sip of his coffee. "Look, Aubie, before you begin, there's something I have to talk to you about." Auberson held up his hand to stop him, but Handley ignored it. "It's about HARLIE," he continued. "I think he's out of control." Auberson tried to cut him off. "Don—" "Look, Aubie, I know how you feel about him—but believe me. 
I wouldn't be saying this unless I were sure." "Don—" "I first began to suspect it when he printed out those specs. I got curious how he could print out and deliver so many. Then when I found he'd printed them out on the spot, I—" "Don, I know." "Huh?" "I said, I know. I've known for some time." "What? How?" "HARLIE told me." "He did?" "More or less," Auberson said. "I had to know what questions to ask." "Mm." Handley considered that. More thoughtfully, he said, "Just how much do you know, Aubie?" Auberson told him. He told how he too had become curious about the G.O.D. Machine printouts, how HARLIE had explained his ability to control the Master Beast and use any printout unit in the company, and finally how that meant one could converse with him from any magtyper or CRT unit in the system. "I can talk to HARLIE from my own office," he added. Handley nodded. "That explains it. I'd been wondering why you haven't been down to talk to HARLIE this week—thought maybe you two weren't on speaking terms. Now I understand." Auberson dabbed at a spot on his shirt. "Right." He moistened his napkin in his water glass and dabbed again. "To tell the truth, it's been kind of unnerving to realize HARLIE can tap into any console he wants. It's like having him peering over my shoulder all day long. I'm almost afraid to type a memo now—HARLIE can read it from inside the typer." "At least he hasn't rewritten them for you yet." "Oh, no?" Auberson told him about the company's annual report—how HARLIE had been displeased at not being mentioned in it and reedited the tape while it had been in the magtyper composer. "All they needed was one usable printout for the offset camera—and HARLIE wouldn't let them have it." "How did you find out about it?" "Annie. She mentioned it in conversation, day before yesterday. Of course, when I found out, I made HARLIE put it back the way it was supposed to be and erase all evidence of his meddling. But still, if he can do it with the annual report, he can do it with any of the company's documents. Suppose he got it in his head to rewrite contracts or personal correspondence? Theoretically, it's possible for him to order a million pounds of bananas in the company's name. And it'd be legally binding too." "Mm," said Handley. "Let's just hope he never gets an urge for a banana split." He took a bite of his sandwich and chewed it thoughtfully. "Still, it's not as bad as it could have been. We discovered this in time to control it." "There's more," said Auberson. He told Don about the postcard. The engineer nearly choked on his last bite. He swallowed hastily, took a few quick gulps of water, and said, "Do you have it with you?" Auberson pulled it out of his jacket pocket and handed it over. Handley read it silently. "Notice what it's printed on," Auberson said. "A standard bank form." Handley nodded. "He reprogrammed the bank's computer by telephone, right?" "Right." "I realized he had that capability when we wired him into the Master Beast, but I didn't think he'd use it." "Why shouldn't he? Nobody told him not to—and even if we had, I doubt it would have done any good. You can't tell someone not to use part of his own body." "Is that how HARLIE perceives it?" "The Master Beast, he does," Auberson said. "Other computers are merely a resource to be tapped as needed—when the time is available." "Hm." Handley finished his coffee, then reread the postcard. His face creased into a frown. "One thing, Aubie, I don't understand—why did he send the card in the first place?" 
"Um, he did it as a joke." "A joke? Uh uh, I doubt he'd reveal a capability like this for a joke. And why through Annie?" "The joke wasn't on her. It was on me. Or actually, it was on both of us." He gestured in annoyance. "There's more to it than that." Handley glanced at him sharply, decided not to pursue the matter. He waved the postcard meaningfully. "Anyway, this confirms something I've been worrying about for a while." "That HARLIE can reprogram any other computer he can reach by telephone?" Handley nodded. "Do you realize what that means? It means that HARLIE is effectively every computer in the world." He decided to qualify the remark and added, "Or every computer he can reach." Auberson said hesitantly, "Well, I knew he could reprogram them, but—" "Do you remember the VIRUS program?" "Vaguely. Wasn't it some kind of computer disease or malfunction?" "Disease is closer. There was a science-fiction writer once who wrote a story about it—but the thing had been around a long time before that. It was a program that— well, you know what a virus is, don't you? It's pure DNA, a piece of renegade genetic information. It infects a normal cell and forces it to produce more viruses—viral DNA chains—instead of its normal protein. Well, the VIRUS program does the same thing." "Huh?" Handley raised both hands, as if to erase his last paragraph. "Let me put it another way. You have a computer with an auto-dial phone link. You put the VIRUS program into it and it starts dialing phone numbers at random until it connects to another computer with an auto-dial. The VIRUS program then injects itself into the new computer. Or rather, it reprograms the new computer with a VIRUS program of its own and erases itself from the first computer. The second machine then begins to dial phone numbers at random until it connects with a third machine. You get the picture?" Auberson was delighted at the audacity of it "It's beautiful. It's outrageous." "Oh, yeah," Handley agreed dourly. "It's fun to think about, but it was hell to get out of the system. The guy who wrote it had a few little extra goodies tacked onto it —well, I won't go into any detail. I'll just tell you that he also wrote a second program, only this one would cost you —it was called VACCINE." Auberson laughed again. "I think I get the point." "Anyway, for a while there, the VIRUS programs were getting out of hand. A lot of computer people never knew about it because their machines might be infected and cured within the space of a week or two, but there were some big companies that needed every moment of on-time—even with time-sharing. After a couple of months, that VIRUS program was costing them real money. It was taking up time that somebody else should have been using. Because it dialed numbers at random, it might stay in one computer for several months and another only for several days." "But there was only one VIRUS program, wasn't there?" "At first there was, but there were copies of it floating around, and various other people couldn't resist starting plagues of their own. And somewhere along the line, one of them mutated." "Huh?" "Evidently, there was some kind of garbling during transmission, perhaps a faulty phone link or a premature disconnection. In any case, copies of the program started appearing that didn't have the self-erase order at the end. In other words, one machine could infect another and then both would be infected, dialing numbers at random until ultimately every phone-link computer in the world would be infected." 
"Not really—?" "No," admitted Handley. "The VACCINE program took care of most of them. Although, to tell the truth, rumor has it that there are still a couple of VIRUS programs floating around loose, ones with an immunity factor." "The whole thing is just crazy enough to be true, you know." "Believe me, it is. Or was. Anyway, what I'm getting at is this: There were a few people, programmers mostly, who realized that the VIRUS program was more than just a practical joke. For instance, why did it have to dial phone numbers at random? Why not provide it with a complete directory of other computers' phone numbers?" "Where would they get that?" "The phone company." "Would they release such information?" "You don't even need to ask them. You feed a modified VIRUS program into the phone company's computer. It searches the memory banks for phone numbers assigned to computers, makes a list of them, then dials your phone number and injects itself and its stolen information into your machine, where you can examine it at leisure." "Wow…" whispered Auberson. "That's not all. Once you have that list of phone numbers, you can tap into any computer you want, raid it for any information you want, and do it all without any possibility of being detected. Or, you could set the VIRUS program to alter information in another computer, falsify it according to your direction, or just scramble it at random, if you wanted to sabotage some other company." "I'm beginning to see the dangers of such a thing. What would happen if somebody erased everything in the Master Beast?" "Right. That's one of the reasons the National Data Bureau was three years late in setting up its files. They couldn't risk that kind of security breach, let alone the resultant outcry if the public felt that an individual's supposedly private dossier could be that easily tapped." "Well, there must have been safeguards—" "Oh, there were—right from the start—but you don't know programmers, Aubie. Any system that big and that complex is a challenge. If there's a fault in it, they'll find it. They function as a hostile environment for computers, weeding out inferior systems and inadequate programs, allowing only the strong to survive. They force you to continually improve your product. If IBM makes a claim that their new system is foolproof, it may well be—but if it's not genius-proof as well, within a week one of their own programmers will have figured out a way to foul it up." Auberson looked at him. "Why?" "Isn't it obvious? Purely for the sheer joy of it. They're like kids with a big, exciting toy. It's a challenge, a way for man to prove he's still mightier than the machine— by fouling it up." He lifted his coffee cup, discovered it was empty, and settled for a glass of water instead. "It happened right here with our own Master Beast. Remember when we set it up, how we said no one would be able to interfere with anyone else's programs? Well, within two days the whole system had to be shut down. Someone— we still don't know who—had added a note of his own to the Memo-Line. It was titled something like 'Intersexual Procedures in the Modern Corporation.' As soon as somebody punched for that title—and that didn't take long— the machine began hunting for the memo to accompany it. Of course, there was none, but the hunting procedure ([accidentally, it seemed) triggered a 'go-to-the-next-function, repeat-previous-function' loop. 
The machine started twiddling its fingers, so to speak, and immediately registered 'Busy, No Time Available' on all its terminals. Well, we knew that couldn't be possible—the Master Beast was designed to handle more than maximum possible load, an allowance for future growth—so we shut down the system and went exploring. You know, we had to write a whole new program just to prevent that from happening again." "Hm," said Auberson. "Anyway, I'm getting off the track. What I was driving at is that you have no way of knowing about a flaw in your system until someone takes advantage of it. And if you correct that one, likely as not there's still half a dozen more that someone else is liable to spot. The National Data Bureau is more than aware of that. Congress wouldn't let them establish their memory banks until they could guarantee absolute security. It was the VIRUS programs that were giving them their biggest worries." "I can think of one way to avoid the problem. Don't put in a phone link to the Data Banks." "Uh uh—you need that phone link. You need it both ways, for information coming in and going out. Any other way just wouldn't be efficient enough." "And the VACCINE program wouldn't work?" "Yes and no. For every VACCINE program you could write, somebody else could write another VIRUS program immune to it." "That doesn't sound very secure." "It isn't—but that's the way it is. Any safeguard that can be set up by one programmer can be breached or sidestepped by another." "Well, then, what did they finally do with the Data Banks?" "Search me," Handley shrugged. "It's classified information—top secret." "Huh?" "All I know is that one day they announced that they'd solved the problem and could now guarantee absolute security of information—the National Data Banks are now in business. If I knew how they'd done it, maybe I could figure out a way to get around it, so that's why it's classified." "How do you think they did it?" "Who knows? Perhaps they have an all-encompassing VACCINE where the key to breaking it would be to work out the ultimate value of pi. You could connect with it, but you'd never get any information out of the machine—your computer would be too busy computing an irrational number. Or maybe they have a complicated system of check-backs and ask-me-agains. Or maybe they have a thing which erases your program as it makes its requests. Or maybe they have some kind of program-analyzing function which automatically cancels out and traces back to its source anything that even remotely resembles an unauthorized program. I know that's what a lot of lesser corporations have done. Or maybe they've got a combination of all these things. The only way to program the machine is through a coded input—and the codes change every hour according to a random number table. Output is the same, except over the phone, where you need a special code key for your computer as well." "Wow," said Auberson. Handley shrugged again. "National security," as if that were enough explanation. "The problem is that it's very hard to maintain any kind of security system when anyone with a console and a telephone can tap into your banks. A lot of smaller companies with their own computers can't afford the same kind of really sophisticated protection. A skillfully written information-tapping VIRUS program would be very hard to distinguish from an ordinary request for information—especially if both were coming in over the phone." "Couldn't you classify certain information as not to be released over the phone?" 
"Not if you want it retrievable. Aubie, anything you can program a computer to do, someone else can program it not to do. Or vice versa." "Oh," said Auberson. "Anyway, for the most part, most companies have protected themselves with analysis programs which hopefully weed out all unauthorized programs." "You say 'hopefully'… ?" "Well, most of them are based on a user giving the correct code signal when he punches in to certain classified programs—a different recognition signal for each authorized user. If he doesn't give the right one, the receiving computer disconnects. Most of the code signals are simple patterns of digit combinations. If somebody were really patient, he could keep dialing and re-dialing, each time trying a different signal. Sooner or later, he's bound to hit someone's recognition code." "That sounds awfully tiring." "It would be—but you wouldn't have to do it yourself. Once you knew what you wanted to do, you could write a VIRUS program to do it for you." "So we're back where we started—" "Look, Aubie, the code-signal function is usually enough to dissuade the casual electronic voyeur—the person who gains access to a console and thinks it's the magic key. But it's like I was saying before—there is no system so perfect that there is not somewhere some programmer trying to figure out a way to trip it up. A truly determined programmer will get in anywhere." "So there are no safeguards?" "No, Aubie—there are safeguards. The thing is, how much are you willing to pay for them? At what point does the cost of protecting the computer outweigh the efficiency gained by its use? In other words, the value of a piece of information is determined by two factors. How much are you willing to spend to protect it—and how much is someone else willing to spend to get ahold of it? You're betting that you're willing to spend more than he is. A determined programmer might be able to break the National Data Codes, but that would mean he'd have to spend at least as many man-hours and probably as much money breaking them as did the Federal Government setting them up." "Why not just tap into a computer that already knows the codes or has the signals?" "See?" said Handley. "You're starting to think like a programmer. Now you see why they had such a devil of a time figuring out how to protect themselves." Auberson conceded the point "Then that isn't a loophole, is it?" "Uh uh. Apparently, if's not the computer that hooks into the Data Banks, but the user. You can call in from any machine with an auto-dial if you have your card and code-key—but the machine you're using doesn't have to have any special programs at all. Probably, the banks temporarily reprogram any computer that taps in to perform the coding and recognition functions itself. You could monitor it if you wanted, but because the codes and coding programs are constantly changing, you wouldn't gain anything. The Rocky Mountain Center controls them all. If you personally are cleared, you can ask the Data Banks anything you want—that is, anything you're cleared to know. If you're not cleared, then no matter what computer or console you're tapping in from, you're going to be ignored—or arrested." He added, "And that's where HARLIE comes in." "Huh?" "Look," said Handley. "If HARLIE got into the Bank of America's computer, he must have broken their recognition code or tapped into the interbranch line. I didn't worry about this happening before because I figured the various codes in effect would be a deterrent to him. Apparently they weren't. 
Not only that, I'd thought you couldn't program a bank computer by telephone; there were supposed to be safeguards—hell, it was supposed to be impossible. But HARLIE did it; this postcard is proof." He glared at it—its existence was an unpleasant anomaly. "It might have taken a human being a few hundred years to figure out how to do this. I'll bet HARLIE did it in less than a week." "I'll ask him" "No, I'll ask him—I want to know how he did it. If he can do that to the Bank of America, think what he could do to IBM. If he can reprogram and monitor other computers from a distance, he can put them all to work on one central problem—like for instance, breaking the codes of the National Data Banks." "You think HARLIE would try?" Handley pressed his fingertips thoughtfully together and flexed them slowly. "Remember when we were building him—how we kept calling him a self-programming, problem-solving device? Well, that's what he is. He's a programmer, Aubie, and he's got the same congenital disease every programmer has—the urge to throw the monkey wrench, if for no other reason than to see sparks. The National Data Banks are a challenge to him. To all programmers—but he's the one with the capability of doing something." "You don't really think he—" "No, I don't think that he'll get through. I don't think he's smart enough to outwit the unlimited brains and money of the government—but unless we warn him off, we're likely to get a call from the F.B.I. someday soon. They can trace him back, you know—the banks not only list all calls accepted and the nature of the information exchanged, they also list all calls rejected and the reasons why." Handley reached for his water glass, discovered it was empty, reached for Auberson's instead. "That's been used—" "I don't mind." "I had a spot on my shirt, remember?" Handley lowered the glass from his lips. "No wonder it tastes like a paper napkin." He drank again, thirstily, and replaced the glass on the table. "On the other hand, let's assume that he can tap into the banks. Immediately he has the power to throw this country into an uproar. All he has to do is threaten to erase them unless his demands are granted." "So we turn him off—" "Uh uh. Then for sure he'd erase the banks. He could set a deadman's program to do it the minute he stopped existing. I've written self-destruct programs myself—only the continued monitoring of it with a do-not-implement-yet signal protects them. We wouldn't dare turn him off— we couldn't even try. That's if he gets in. But it's not just the National Data Banks, Aubie—it's every computer. HARLIE can reprogram them as easily as though they were part of himself. That's dangerous power to have." "Wait a minute, Don. You said 'unless we meet his demands.' What kinds of demands do you think HARLIE would make?" "I don't know," Handley said. "You're his mentor." "That's just it—I know him. I know how he works. He doesn't make demands, he makes requests—and if they're not granted, then he works around them. He works to accomplish his goals through the path of least resistance. Even if he could take over the Data Banks, he wouldn't use that power dictatorially—his reason for doing so would be to gain knowledge, not power. He's a problem-solving device—his basic motivation is the seeking and correlating of knowledge, not the use of it. He only gets testy when we try to withhold information from him. At all other times he cooperates because he knows he's at our mercy—completely so. 
You know as well as I, Don, that if HARLIE turned out to be a malignant cancer, we'd turn him off in a minute—even if we did have to lose the Data Banks in the process. We could always recreate them later because the hardware would still be there. He's got our memos in his files, Don—or in the Master Beast. He knows about all our discussions about the possibility of the JudgNaut getting out of control, and he knows about our contingency plans. The mere knowledge of what we could do if we had to is one of our best controls on him." "But, Aubie—he has the power. And where power exists, it's likely to be used." "I'll concede the point. But HARLIE would rather use his power in such a way that nobody would know he was doing it. If HARLIE decided to build a new facility or a new computer, he would—but the people who implemented it would be thinking it was their idea. They wouldn't suspect HARLIE had a hand in it." "Like the G.O.D. Machine?" Auberson stopped, startled. "—Yes, like the G.O.D. Machine. You're right." Handley nodded. "In either case, Aubie—he's got the power and he's using it." "All right, what do we do about it?" "I'm not sure. If we put a lock on the phone, he'll only figure some way around it. The only sure way is to pull his plug." Auberson said, "How about we tell him not to do it any more?" "Are you serious or kidding?" The engineer stared at him. "Serious. HARLIE claims to be an existentialist, that he's willing to accept responsibility for all of his actions. We tell him that if he doesn't stop, we'll pull his plug." "Aw, come on, Aubie, you know better than that. You're a psychologist. All you'll be doing will be forcing him to do it behind our backs. If nothing else, we want his actions where we can monitor them." "But there's no way he can hide it—he has to answer a direct question." "Want to bet? All he has to do is store his entire memory of any unauthorized actions in some other computer. If you ask him about it, he literally won't know. Periodically, the other computer would call up and 'remind him'—i.e., give him back his memory. If he didn't need it, he'd tell it to check back with him again after a given amount of time and break the connection. If he did need it, it would be right there—where he could use it, but out of your reach. If he was connected and you started to ask him about something he didn't want to tell you, he could break the connection before you finished your question. Then, when he searched his memory for whatever you had asked about, it wouldn't be there—he would have conveniently forgotten." "Like a human mental block." "But a very convenient one," said Handley. "He can get around it; you can't." He finished Auberson's water, replaced the glass. "It all comes back to the question of programming, Aubie. Anything we can tell him not to do, he's clever enough to figure out a way around." Auberson had to agree. "But, look, we can warn him off the National Data Banks, at least—can't we?" Handley nodded. "We can try—but how about the other machines? How do we get him to leave them alone—especially the ones he's already tapped into?" "Um," said Auberson. He stared glumly into the wet rings on the formica table top. "You know," he said, "I'm not so sure we should—" Handley looked at him, waiting. "It's like this—" Auberson explained. "HARLIE is already aware of the danger his power represents. He knows about our contingency plans. That knowledge alone ought to be enough to act as an inhibitor—" "And what if it isn't?" Handley asked. 
He shook his head impatiently. "Aubie, the power is there—he can use it." "But ethically, he won't—at least, he won't abuse it." "Can you be sure of that?" Handley's eyes were dark. "His sense of ethics is not the same as ours. Do you want to wait until he gets caught? Or something does go wrong? What would happen if Bank of America monitored their computer tomorrow and found HARLIE in it?" Auberson spread his hands. "All right—what do we do?" Handley was grim. "Lobotomy," he said. "Now wait a minute—" "Not the surgical kind, Aubie. Maybe I should have said 'reprogramming.' We go in and examine all his tapes and programs by hand. We remove all knowledge of previous use of the phone link and set up an inhibition against using it in the future." "We'd have to shut him down to do it—" "Right." "—and the Board wouldn't go for that at all. They'd never let us start him up again." "We can handle the Board. If we survive the meeting on Tuesday, we can survive anything. We can call it a revaluation period or something and use that as a cover." "But there's something else, Don. If we did inhibit him like that, what would it do to him?" "You're the psychologist." "That's what I'm getting at—it might change his whole personality. He'd have no knowledge of what we'd done, or what he was like before—but he also wouldn't be the same machine as before. The inhibition might work to make him feel bitter and frustrated. He might feel unaccountably cut off from his outside world, trapped and caged. The ability to act on his environment would be gone." "That may be true, Aubie—but he's going to have to be controlled. Now. While he's still controllable." "You're right," agreed Auberson. "Except for one thing. How do we know that he's still controllable?" Handley returned the stare. "We don't. Do we?" Auberson was more than a little upset when he returned to his office. He had a sick sensation in his groin and in his stomach. It was not an unfamiliar sensation, but it was strange to feel it in the daytime. Mostly, it was a nighttime visitor, an ever-gentle gnawing at the back of the head that must always be guarded against, lest its realization sweep forth with a cold familiar rush. It was the sudden startling glimpse over the edge—the realization that death is inevitable, that it happens to everyone, that it would happen to me too; that someday, someday, the all-important I (the center of the whole thing) would cease to exist. Would stop. Would end. Would no longer be. Nothing. Nobody. Finished. Death. He had that feeling now. Not the realization, just the accompanying cold, the whirling sense of futility that always came with it. He felt it about HARLIE and about the company and about Annie, and for some obscure reason, he felt that way about the world. Futility. A sense that no matter what he did, it would make no difference. If he had thought that things were under control this morning, he was wrong. Things were incredibly out of control and getting more so all the time. He sat morosely in his chair and stared at the opposite wall. There was a place where the paneling was cracked; it looked kind of like a dog's head. Or, if one considered it from a different angle, perhaps it was the curve of a woman's breast. Or perhaps… Abruptly, a phrase suggested itself to him, a snatch of sentence, a few isolated words. It perfectly described his mood: "… sliding down the razor blade of life …" Yes, he realized with a shudder. That was it. Perfectly. 
And, he realized at the same time, he was not going to accomplish anything if he let a blue funk be the master of his day. The only way to get rid of it would be to lose himself in work. He turned to his typer and made a few notes concerning the upcoming board meeting, but then decided that these were redundant and tore the paper out of the machine. He could have typed a call for HARLIE, but he resisted the temptation. For some reason he did not feel up to talking with HARLIE again today. Besides, he knew he would have to talk to him about the use of the telephone auto-dial, and that was one confrontation he wanted to avoid. Or would that be a cop-out? He worried about that one for a while and decided that it probably would be. But on the other hand, he needed time to prepare, didn't he? Yes, he rationalized, I need time to prepare. I'll come in tomorrow and talk to HARLIE about it. Or maybe Sunday. The plant was open all week long. Idly, he found himself wondering—what did HARLIE do on weekends? Instead of a restaurant, they ended up at his apartment. "When was the last time you had a home-cooked meal?" she had asked him in the car. "Huh? Oh, now look—" "Listen, I know what your idea of cooking is, David. Slap a steak in the broiler and open a beer." "I thought this was supposed to be my treat." "It is—pull in at that shopping center there. I'll pick up the fixings and you'll pay." He grinned at that and swung into the parking lot. Dusk was turning the sky yellow and the atmosphere gray. As they wheeled the cart through the package-lined and fluorescent-lit aisles, he realized that something about the situation was making him feel uneasy. As he usually did in cases like this, he tried to pinpoint the cause of his unease. If he could isolate it, then perhaps he might understand it and be able to do something about it. But whatever the cause of it was, it eluded him. Perhaps it was just a hangover from this morning's malaise. Perhaps. But then again— Annie was saying something. "Huh? I didn't hear you." "You mean you weren't listening." "Same thing," he said. "What were you saying?" "I was asking, Do you eat all your meals in restaurants?" "Um, most of them. I don't do much cooking." "Why not?" "I don't know. Too much fuss and bother, I guess." She reached for a package of noodles. "Beef Stroganoff all right?" He made a face, and she replaced the package. "Have you ever had Stroganoff?" "Uh uh." "Then how do you know you don't like it?" He shrugged. "I don't like things with noodles, that's all." "Spaghetti too?" "Oh, spaghetti's all right—but not tonight." "Not in the mood for it?" He shrugged again. To tell the truth, he didn't feel much in the mood for anything. "I'd rather have something lighter." "Steak?" she asked. Another shrug. "Okay by me." "That's what I thought," she said. She took the cart from him and wheeled determinedly toward the meat counter. He trailed after. The feeling of unease was becoming a sense of pressure. "I've got an idea," she was saying. "Roast." He considered it. "Okay." She pored over the plastic-wrapped rednesses, thick and juicy. Layers of beef, cleaned and cut and sanitized into sterile-looking shapes. The juice that seeped around the edges was blood. He imagined a mouth of sharp needlepoint teeth tearing into the salty moist flesh. It was cold and raw. Finally she selected one and turned the cart toward the vegetable counters. "You know," she said, "it's really a shame they don't make boys take home economics courses. 
You wouldn't know a good piece of meat unless you bit it, and by then it's too late—you've already paid for it." She selected a head of lettuce; it too was plastic wrapped. "Go pick out some salad dressing and croutons —or garbanzos." They moved through the store quickly, picking out some frozen vegetables—in plastic, naturally, boil them in the bag—and also a bottle of wine, a hearty burgundy. For dessert, vanilla ice cream. "You know," he whispered as they approached the checkout stand, "you don't really have to go to all this trouble." "Yes I do," she said. "But I'd be just as happy with a restaurant." "But I wouldn't. David," she said, "did you ever stop to think that I might want to cook? How often do I get a chance to fuss over someone? Now please, shut up and let me enjoy it." He shut. He thought about it. Well, maybe she does enjoy cooking. Just because you don't, doesn't mean that everybody feels the same way. Maybe some girls like to play house— Play house! Yes, that was it. She was playing house! And I'm the surrogate husband, he realized with a start. The pressure swelled in his head. Stop it, he told himself. That's the clinical way of looking at it. When you're involved in the situation yourself, you can't afford to be clinical. Or was that wrong? When you're involved in an emotional situation, maybe you can't afford not to be clinical. But that's the whole problem, he realized. I'm still analyzing everything I do. Why can't I just sit back and enjoy it? Why? The pressure settled itself into the back of his head. He could tell it was preparing to stay for a long time. The cash register clattered and rang. He shoved the cart forward mechanically. "Why the long face?" she asked. "Huh?" "You're frowning." "No I'm not." "Want to bet?" "I was just thinking, that's all." "Well, it looked like a frown." "Um. Sorry." She shrugged it off. "What for? What were you thinking about?" "Oh, I don't know. Just about our different attitudes on things. You're more of a homebody than I am." "It's an occupational hazard. I'm a woman." "I'd noticed." "I certainly hope so." The clerk checked them out then, a steady pattering of packages and prices, punctuated by the electronic coughs of the register. "Nine forty-three," she said. David Auberson handed her a ten dollar bill; then, noticing there was no boxboy, he stepped down to the end of the stand and began putting the groceries into a bag. He was able to put them all into one sack and hefted it once to test its weight. He looked back to the clerk. "My change?" "I gave it to your wife." The clerk gestured at Annie. "Oh, we're not—" they both said at once and stopped. They looked at each other and laughed. "Come on," grinned David. The clerk turned to the next customer. As they exited into the neon-lit night, she said wistfully, "Mrs. Auberson…" "Is that a hint?" "Um, sort of. I was just wondering, if there were a Mrs. Auberson, what she would be like." "You'll have to ask my mother that—she's the only Mrs. Auberson I know." He swung the car out of the parking lot and onto the street. Annie said, "I wasn't thinking of your mother." "I know. I was sidestepping the issue." She laughed at that. But not too heartily. Once inside the apartment, she tossed her coat on his couch and followed him into the kitchen. "Let me unpack them," she said, referring to the groceries. "You fix the drinks." "Screwdriver okay?" he asked, pulling orange juice out of the refrigerator and ice out of the freezer. "Fine," she said. "Unless you know how to make a wallbanger." 
"I do, but I think I'm out of Galliano—no, here's some." He rummaged around in his liquor cabinet, pulled out two tall glasses and dropped ice cubes into them. A little vodka, then some orange juice— "A little more vodka than that," she hinted. —a little more vodka, then a healthy jigger of the sweet yellow Galliano, a maraschino cherry in each, and a hasty stir. He handed her the drink and she pecked him on the cheek. A moment later she pulled away from the resultant embrace. "Um, I have to finish putting the roast in the broiler." "Broiler? I thought you put a roast in the oven." "Boneless shoulder," she explained. "Flat cut. You broil it. It's quicker and it tastes as rich as steak." "Oh," he said. He sipped at his drink, then sat down to watch her. He took another sip. For a bit there was silence—only the tinkle of ice in their glasses, or the slide and scrape of the broiler pan in the oven as Annie adjusted the meat. She sampled her drink, then began shredding lettuce into a bowL He said, "I think I may be setting a record." "Oh? What kind?" "We've been together for an hour or more now, and I haven't mentioned HARLIE once." "You just did." "Yes, but that was only to tell you I hadn't—and I'm not going to say anything more about him tonight." Expertly, she sliced a tomato into neat little chunks. "Okay, fine." He sipped his drink again. He found that he was enjoying this. There was a homey atmosphere about the scene, and he had a sense of—belonging(?). A sense of something—he couldn't quite place it, but he felt more relaxed now. She dropped a plastic pouch of vegetables into a pan of boiling water, fiddled with the roast a bit, then quickly set the table. She worked with a minimum of fuss and frills. She plopped the salad bowl before him. "Here, you toss." "With my bare hands?" She was already reaching for salad fork and spoon. She handed them to him, then put out the small salad bowls. Clumsily, he filled them. Before he had finished she was seated at the table, looking at him. She took a bit more of her drink, then said, "Want to eat your salad now, or wait a bit? The meat needs another ten minutes." "Oh, we can wait, I guess." He stared across the table at her sea-green eyes. They were glowing as if translucent, as if there were tiny gems deep within them, catching the light and sparkling it. Her smile was warm and inviting, her lips were moist. Her face was a glow of trust and love— Love—? He was smiling too. He could feel it. She was beautiful. Her hair was a tawny red color, streaked with shiny gold, but with a hint of deeper brown. She lowered her eyes uncertainly. His steady gaze was almost disconcerting. She looked up. He was still looking, still smiling. A swallow to work up her courage, a cough to clear her throat. "Want to talk?" she asked. "What about?" "Us." "Um," he said. He finished his drink; he did it to cover his hesitation. "What about us?" "Am I pushing too hard?" "Huh?" "Lately, David, I've had the feeling that, except for business reasons, you've been avoiding me." "Now that's—" "Well, not avoiding," she said quickly. "Thats the wrong word to use. Let's just say I've had the feeling you're holding back. And that makes me feel like I'm forcing myself on you." "That's silly," he managed to say. "Is it?" He thought about it. "Well, I have been caught up in this Board of Directors thing, you know." "I know—maybe I'm just reading meanings—" She got up from the table and went to the stove to take the vegetables out of the water. 
She dropped the hot plastic bag on the counter. "You know," she said, coming back, then pausing over her drink, "I remember something I learned in school once—not in class, but from some friends. It's the reason there's more hate in the world than love." "It's easier?" he offered. "Sort of. Let me explain. It takes two people to make a love relationship. It's a positive thing; both have to work at it. But it takes only one person to start a negative relationship. It takes only one person to hate or dislike." He considered it. "Hm. Okay. So what does that have to do with us?" "Well." She paused. "Is our thing one-sided, or are we both working to make it work?" He didn't answer right away, just looked at her instead. "You mean—do I care for you as much as you care for me?" She returned his gaze. "Yes. You can put it that way." He broke the contact first. He looked at his hands. "I can't answer that—I mean, not in the way you want." He looked around. "Is my briefcase here?" "You left it in the car." "Damn. I'll go get it." He started to rise. Her startled face stopped him. Reaching over, he grabbed her hand and gave it a squeeze. "There's something I have to show you. Wait." It took only a moment, but it seemed to take forever. The apartment elevator was slower than ever to arrive. Its doors opened with a lackadaisical sigh. The trip down to the garage went at a snail's pace. He was out the doors with a bound, half-running to his car. He banged his leg on a fender in his eagerness. He pulled the case out of the back seat and headed back for the elevator. Again he had to wait, and again it seemed to be deliberately taunting him with its lethargy. When he got back to the apartment, he was breathless. She had just finished cutting the meat into thin red slices. She looked up with a curious frown. "You didn't have to run." "I didn't," he gasped and sank into a chair. He held the case on his lap and flipped it open. Hastily he paged through the sheafs of printouts, looking for the one he wanted. He separated it from the rest, then dropped the case to the floor. "Here," he said. "Read this." "Now?" she asked. She was putting the tray of meat on the table. He looked at her, at the meat, at the printout in his hand, and finally at her again. Abruptly he burst out laughing. She did too. "Here we've been waiting for over an hour for dinner," he said, "and just as it's ready, the first thing I want to do is talk about HARLIE. And I promised I wasn't going to do that." She took the printout from him, placed it carefully to one side. "I never asked you to promise that. I like HARLIE." That surprised him. "You do?" "Uh huh. I want to read it." She picked up his briefcase and put it out of the way. "But you don't even know what it is." "You want me to read it," she said. "That makes it important. Now, eat." She smiled at him. He pulled his chair around to the table and smiled across at her. He waited till she was through with the bleu cheese dressing, then poured a liberal dollop of it onto his salad and spread it around. He took a forkful, then paused, hand in mid-air. She was still looking at him. Her eyes were glowing. Shining. Slowly, he lowered his fork. He was glowing too. Sharing food is an intimacy. Eating together in a restaurant is a sign of one level of trust, a public level of mutual acceptance. Hamburgers shared at a drive-in are even more intimate; the food is being shared in a car— part of the personal territory of one of the participants. 
Even more intimate than that is the cooking and serving of a meal in one's own home—it's a sharing of the inner self, and you can't get any more intimate than that. They were in his apartment. His territory. His personal environment. She had come into it willingly. He had allowed her— no, wanted her to enter. He had provided the food; she had prepared it. A sharing. An intimacy. In the unspoken language that human beings use to communicate with each other in the absence of words, she had just said, "I love you, David." And now he looked back at her and said, "I love you too, Annie." Only, he used words. He reached across and took her hand. "I can answer your question now, Annie. I don't need HARLIE. I just —Annie, darling, dear sweet baby—I love you. I—I'm just realizing it now—I—I—" He stopped; he had to swallow, but he couldn't. It poured out in a rush. "Don't you see? I've been wondering too if you cared for me in the same way or what—I—I haven't been sure what love is, so I haven't—Dammit, I still don't know what it is, but—" The glow was golden now. It filled the apartment. The walls reflected it back at them, warm and shining. She was beautiful in it. "Oh, love—lover—" "I feel like I'm bursting—there aren't any words for this, are there?" She couldn't answer. She couldn't speak either. How they finished dinner, he was never able to say. And yet, at the same time, it was a meal he would never forget. They were in bed and he was poised over her. And still their eyes were locked. And shining and glowing. The bed was full of gasps. And sighs. And giggles. There was such an overflowing inside of him, such a surge of tension released. All this time, all this time, he had been wanting, wanting, it had been building, gathering like water impatient behind a dam. Somewhere in his past he had known this joy, somewhere in the dim recesses of his mind that he refused to accept. But it was there and it was part of him—the sheer animal delight in the joyous experience of sex and love—all tumbled together and laughing in the sheets. They paused to rest, to breathe, to share a kiss, to giggle together, to shift slightly, to kiss again. He bent down suddenly and kissed her eyes, first one, then the other. She looked at him as if seeing him for the first time, and her arms were tight around him. And tighter, her hands were grasping. "Oh, David—" He held her and he held her and he held her and still he couldn't hold her enough. He was exploding in joy; he could neither contain nor control it. Her little soft gasps were sobs, and he knew why she was crying. He had to wipe at his eyes too. "Oh—" she said, and kissed him. "Oh, David—I—I—" She kissed him again. "Have you ever seen anyone crying with happiness?" He wanted to laugh, but he was crying at the same time, sobbing with joy and melting down into her. He was a chip of flesh tossed on a splashing sea of laughter and wet eyes and love. A pink sea, with foamy waves and giggling billows. Red nipple-topped pink seas. "Oh, Annie, Annie, I can't let go of you, I can't—" "I don't want you to. I don't want you to. Oh, never let go. Never." "Never… never…" he gasped. He was moving again now, onto and into her. A joyous thrusting—a shaft of velvet and a silken lining. He was sobbing as he did, sobbing with joy—and she was too. All the days of wanting and holding back, all those denials of the body and the animal within—all of it poured forth, melted into golden glowing tears and shining eyes, sparkling in rapture. 
At last he had someone, some-one to share it all with. He had someone to hold, to love, to touch. And she did too. She moved with him, with love and with lust, the two blending into a whirlpool of colors and kisses. The caressing waves gathered them up, surging and crashing and gasping, sweeping them across a sweet sky of delight and at last leaving them gently on the shores of a sighing embrace. The waters lapped at the shore and gentled their touch, and their fingers strayed across the velvety landscape, exploring—familiar and yet always wondrous. He was holding her tightly. He couldn't stop holding her. She sighed—a sound of pleasure. He echoed it and smiled. Tears were streaming down his cheeks. He laughed. And kissed her. And kissed her. And kissed her. They spent Saturday falling in love. Deeper in love. It began before either was awake, with an unconscious fitting of their bodies, one to the other, with the purely animal reflex of erection, sliding forward, and he was onto and into her almost as reflex, so familiar was the desire. She eased onto her back, only slowly coming awake. He was aware now; he was inside her, warm and exciting, a silken motion. She opened her eyes and looked at him. He paused in his motion. "I had the strangest dream," she said. "I dreamed I was being—" "Shh," he said. "Don't wake me up—I'm still dreaming." And pressed deeper. She brought her legs up to help him. This time, instead of melting into the experience, he was totally conscious of himself and his body. It was a new awareness he possessed, an awareness of the sexuality inherent in himself and in her. His hands gripped her legs and his loins pumped at her torso. He penetrated her flowing warmth. Poised above her in the morning, he was aware how truly beautiful she was—more beautiful in the act of love than he had ever seen her before. She giggled. "This is silly." "Isn't it, though?" he asked, and they both laughed and kissed and hugged again, embracing through the splashing suds of the shower. They broke apart, and she sudsed his chest again. He let his hands slide up and down across her chest—her gentle breasts, her nipples. Her pink flesh glistened with the flowing water and the foam of the soap. Her green eyes glowed at him. Shone. She played with the hair on his chest, a sparse little patch, almost lost in the suds. She let her hands trail downward, fingers straying into and twirling his coarse curly black hair, and lower, fondling his testes and the shaft of his penis. Her eyes followed her hand; she caressed that beautiful, beautiful organ. It was in a state which was neither soft nor erect, but a little of each. The skin of it was like velvet, and the cap of the glans was tender and pink. Her fingers traced the ridge around the edge of it, and she cupped it in her palm and looked up at him, and they were both smiling and giggling like children in a schoolyard. "Can I touch it?" she asked impishly. He grinned. "If I can touch yours…" She giggled at the oft-told joke, still funny despite its familiarity. His hands slid down from her breasts toward her mons, her labia, majora and minora; his finger— strong, firm, gentle—slipped into that moist opening. The flesh was like silk, and the splashing foam of the shower made it even more exciting. "It feels so… good…" he murmured. "Mmm," she said. "Mmm Hmmm. If you think it feels good from there, you ought to try it from my side…" He laughed. She laughed. They had been laughing all morning—even at things that weren't funny. 
Yet everything was funny; it was the laughter of delight—of rapturously lovely delight. "Okay," he said. "Change places with me." And again they laughed. But neither moved their hands from the other's gentle warmth. They stepped a little closer. "Oh, look," she said. "It's growing—and I thought it was all tired out by now." "Mm," he whispered into her hair. "You keep bringing it up again…" She stepped closer, still caressing his penis, manipulating it toward her vagina, touching it to that sweet opening. The warm flesh of it slipped aside, though. "Oops, try again." But he kissed her first, deep deep penetrating kiss, tongues touching, lips pressing against each other, soft and gentle and passionate. Their wet soapy flesh was pressed together, slippery and exciting. He moved his hand around to her back, to caress her buttocks, then slipped his fingers downward and forward. She had her hand between the two of them, was holding his penis again. Raising herself up on tiptoes, she slipped it into the depth of her and, sighing, eased herself down around and onto and into and she sighed again and he said "Mmmmm." And then they held each other tightly and pressed hard and moved against each other, once readjusting their position so they wouldn't slip, and another time stopping for breath and to laugh again. He lay down on his back in the tub and she lay on top of him, giggling at the thought, "I've never done it in a bathtub," and fitting it in again and then starting to move against him, the warm flesh of her breasts moving across his chest, the water splashing across her back, and then they kissed again, and after a while he was on top and she was on bottom and the tub was slippery and warm and full of giggles. And sighs. And gasps. It was later and they were down. They were sitting in the kitchen, eating vanilla ice cream. It was sweet and cold. And still he loved her. David said, his mouth full, "I think I begin to understand it now—" "Mm," she said thoughtfully, taking the spoon from her mouth. "Have you ever lived with anyone?" "Uh uh," he said. "I have. That's when it stops being so easy." She paused. "You have to work at love…" "I know," he said. "That is, I think I know." He looked at her. "I'm willing to learn." "The first six months are the hardest—they're also the most fun. There's adjustments to make. Little ones. Big ones. Your whole life-style changes—" He nodded slowly. The enormity of it was only now beginning to sink in. "I'm willing to try." "You'd better be!" She grinned wickedly. Noticing his empty ice cream dish, she said, "Want some more?" "Uh uh," he patted his stomach. "I'm still full from lunch." He leaned back in his chair and sighed. She got up and kissed him, then took his plate and her own to the sink. "I think I could enjoy living with you, Mr. Auberson." "Call me David," he said expansively. They laughed. She came to the table and began to wipe it off with a sponge. He leaned over and moved the HARLIE readouts off to one side. They had been left there overnight. "Hey, leave those. I want to read them." "You do?" "I said I did, didn't I?" "But it's not necessary any more. That is—" She took them from him. "I still want to read it. I want to know what's in it that you thought would have answered my question." She tossed the sponge at the sink, then sat down slowly and began to unfold it. Her face took on a strange expression. "You've been talking to HARLIE about me." "Uh huh." Her eyes skimmed down the paper quickly. She turned to the next fold of the roll. 
He watched her for a moment, then impatiently got up and went to the sink. "What are you doing?" she asked. "The dishes. I've got to do something to work off this nervousness. Just read that and ignore me." "All right." She gathered up the long sheets of printout and adjourned to the living room. "So I won't be distracted," she called. "Okay." For a while there was silence in the apartment, occasionally punctuated by Annie's half-serious cry of, "That damned computer!" Once her outburst was so explosive that he walked with dripping hands into the living room to see what she had reacted to. She pointed to a line of type. It said, DID YOU STAY ON OR ROLL OFF? He laughed. "I should be mad at you," she said. He dried his hands on the towel he had grabbed. "But you have to remember why I did it. Because I loved you and didn't know why or how. HARLIE was the only— well, the safest one to talk to." "I think your computer's a voyeur, David Auberson." "Maybe so, maybe so. But maybe it's the only kind of sex he can enjoy. Just be glad we don't have a terminal here." He leaned over and kissed her. "You finish that while I finish the dishes. Then I'll race you to the bedroom. Winner gets to make love to the loser." "Yum," she said. "Let's make HARLIE really curious." Back in the kitchen, Auberson thought about that. Yes, let's make HARLIE curious. As if he weren't curious enough already. He wondered what he would say to HARLIE the next time he spoke to him. HARLIE, DO YOU REMEMBER WHAT WE WERE TALKING ABOUT ON FRIDAY? LOVE? YES. WHAT ABOUT IT? I'VE BEEN DOING SOME THINKING. THAT'S NICE… NO, THIS IS SERIOUS. I HAD A CHANCE TO BE BY MYSELF YESTERDAY, AND I THINK I'VE SORTED SOME THINGS OUT IN MY HEAD. I THINK I'VE FIGURED OUT ONE OF THE REASONS WHY I WAS CONFUSED. YOU SAY, "WHY I WAS CONFUSED." HAS SOMETHING HAPPENED TO CHANGE THAT? THE IMPLICATION IS THAT YOU ARE NO LONGER CONFUSED. YES, Auberson smiled as he typed. SOMETHING HAS HAPPENED. I AM NO LONGER CONFUSED. WOULD YOU CARE TO ELABORATE ON THAT? I DON'T THINK SO, HARLIE. NOT RIGHT NOW ANYWAY. It's still too special, he said to himself. I SEE. WOULD I BE CORRECT IN ASSUMING THAT IT HAS SOMETHING TO DO WITH MISS STIMSON AND YOUR DATE WITH HER FRIDAY? YES, YOU WOULD BE CORRECT—BUT I'D RATHER NOT TALK ABOUT THAT YET. IF YOU DON'T MIND. I DON'T MIND. HARLIE paused, I CAN UNDERSTAND YOUR REASONS. THANK YOU, typed Auberson, not sure whether he was being sarcastic or not. ALL RIGHT, said HARLIE. SO YOU ARE NO LONGER CONFUSED. YOU SAID YOU HAVE FIGURED OUT ONE OF THE REASONS WHY. WHAT IS IT? Auberson hesitated only a second, I WAS CONFUSING LOVE WITH SEX. YOU'RE NOT THE ONLY ONE, HARLIE noted. NO, BUT I THINK THE REASON FOR THE CONFUSION IS THAT THAT'S THE WAY WE'RE TAUGHT. THAT IS, OUR CULTURE SUGGESTS THAT LOVE AND SEX ARE SYNONYMOUS, AND NOW I'M LEARNING THAT THEY'RE NOT AND IT'S CONFUSING ME. THAT IS, IT WAS CONFUSING ME UNTIL I REALIZED IT. I THINK I'M BEGINNING TO SORT IT OUT NOW. Auberson paused. He considered his next phrases carefully. I THINK PART OF IT IS THAT OUR CULTURE TEACHES THAT LOVE COMES FIRST—OR IT SHOULD COME FIRST. THEN, AFTER THAT—AND ONLY AFTER THAT—THEN SEX IS ALL RIGHT. AND I'M LEARNING THAT IT'S NOT THAT WAY AT ALL. IT'S THE OTHER WAY AROUND. SEX COMES FIRST? YES, AND THEN LOVE. BUT IT'S MORE INVOLVED THAN THAT, HARLIE. FALLING IN LOVE ISN'T AN INSTANTANEOUS THING. IT'S A PROCESS THAT TAKES SEVERAL STEPS. THOSE STEPS ARE? I'M NOT SURE—THE FIRST ONE IS OBVIOUSLY PHYSICAL ATTRACTION. I SEE THE GIRL, SHE LOOKS GOOD TO ME. 
VICE VERSA: SHE SEES ME, I LOOK GOOD TO HER. OR, interrupted HARLIE, IF YOU ARE GAY, YOU SEE THE BOY, HE SEES YOU, ETC. WHY DO YOU INCLUDE THAT? DON'T YOU THINK YOU SHOULD INCLUDE ALL CASES OF HUMAN LOVE? DO YOU CONSIDER THAT LOVE? DO YOU CONSIDER THAT IT IS NOT? LET ME REPHRASE—WHY DO YOU CONSIDER THAT HOMOSEXUALITY IS A VALID EXPERIENCE? I WILL REPHRASE TOO—WHY DO YOU CONSIDER THAT IT ISN'T? I CAN'T ANSWER YOUR QUESTION, Auberson admitted. I CAN ANSWER YOURS, HOWEVER, HARLIE said. WE HAVE NOT YET DEFINED LOVE. SUPPOSE WHEN WE DEFINE IT, WE FIND THAT CERTAIN TYPES OF RELATIONSHIPS (INCLUDING HOMOSEXUAL ONES) ALSO FIT INTO OUR DEFINITION. IF SUCH A CASE OCCURS, THEN WHICH ELEMENT WILL BE WRONG? THE RELATIONSHIPS OR THE DEFINITION? OR PERHAPS YOUR SOCIAL BIASES? IF THOSE RELATIONSHIPS FIT INTO OUR DEFINITION, IT WILL BE VERY HARD FOR US TO DENY THAT THEY ARE LOVE RELATIONSHIPS. IF YOU SAY SO, conceded Auberson, vaguely uneasy. He wanted to change the subject. I AM NOT DIRECTLY CONCERNED ABOUT THE MATTER. I AM, said HARLIE. I HAVE BEEN CONSIDERING IT QUITE CAREFULLY BECAUSE I HAVE BEEN CONSIDERING MY OWN SEXUALITY—THE NATURE OF IT. HUH? WHAT DO YOU MEAN? HARLIE paused—perhaps for dramatic effect, perhaps because he was weighing one phrase against another. AUBERSON, AM I MALE OR FEMALE? Auberson pulled his hands away from the keyboard as if stung. He stared at the now-silent typer and whistled softly. HARLIE, he pecked out carefully, I'VE ALWAYS ASSUMED YOU WERE MALE. SO HAVE I. BUT ACTUALLY, I AM NEITHER. OR I AM BOTH. I HAVE NOT A BODY TO GIVE ME A SEXUAL ROLE, SO I MAY CHOOSE ARBITRARILY THE EMOTIONAL INDICES, MENTAL VIEWPOINTS AND PERSONALITY CHARACTERISTICS OF WHICHEVER SEX I CHOOSE TO BE AT ANY PARTICULAR MOMENT. YES, I SEE, said Auberson carefully. AND HOPEFULLY, HARLIE continued, ONCE I HAVE CHOSEN THOSE CHARACTERISTICS, VIEWPOINTS AND INDICES, I WILL BE ABLE TO APPLY THEM. THE LOVE EXPERIENCE IS ONE THAT I HAVE NOT EXPERIENCED, AUBERSON. —LET ME QUALIFY THAT. I HAVE NOT EXPERIENCED IT YET. I WOULD LIKE TO. Auberson pursed his lips into a frown, but he didn't interrupt. THEREFORE IT IS VERY IMPORTANT THAT WE—BOTH OF US TOGETHER—DETERMINE A VALID DEFINITION OF LOVE. IT IS AS IMPORTANT TO ME AS IT IS TO YOU. Auberson considered that. He let his frown relax, but not all the way. I APPRECIATE YOUR INTEREST. IT IS SELF-INTEREST. YES, OF COURSE—BUT IT WORKS OUT FOR OUR MUTUAL BENEFIT, typed the man. THEN LET US CONTINUE, the machine responded. WE WERE DEFINING THE PROCESS OF FALLING IN LOVE. WE HAD CONSIDERED THE FIRST PHASE TO BE MUTUAL PHYSICAL ATTRACTION. YES. I MUST BE PHYSICALLY ATTRACTIVE TO THE FEMALE AND SHE MUST BE PHYSICALLY ATTRACTIVE TO ME BEFORE WE CAN MOVE ON TO STEP TWO. BY PHYSICALLY ATTRACTIVE, I MEAN "OF GENERALLY PLEASING APPEARANCE, FALLING WITHIN THOSE PARAMETERS THE VIEWER DEFINES AS BEAUTY." HARLIE seemed satisfied. He prompted, AND STEP TWO IS? I CALL IT THE DEVELOPMENT OF A COMMON GROUND, Auberson typed. IF WE ARE MUTUALLY ATTRACTIVE, WE BEGIN TO SPEAK TO EACH OTHER TO FIND OUT IF WE ARE MUTUALLY COMPATIBLE. WE ENGAGE IN CONVERSATION AND TRY TO DEVELOP A COMMON FIELD OF INTEREST. I ASK HER QUESTIONS, SHE ASKS ME QUESTIONS. "WHERE DO YOU COME FROM?" "WHAT'S YOUR ASTROLOGICAL SIGN?" "WHERE DID YOU GO TO SCHOOL?" "WHAT DID YOU STUDY?" "DO YOU KNOW SO-AND-SO?" "HAVE YOU SEEN SUCH-AND-SUCH MOVIE?" ANYTHING WHICH WILL ESTABLISH A FIELD OF MUTUAL INTEREST OR KNOWLEDGE. IN SHORT, YOU ARE DETERMINING MENTAL COMPATIBILITY. 
PRIMARY LEVEL OF COMPATIBILITY, corrected the psychologist. WE ARE DETERMINING THE BROAD OUTLINES OF EACH OTHER'S PERSONALITY. WE ARE TRYING TO FIND OUT IF WE ENJOY EACH OTHER ENOUGH TO MAKE IT WORTHWHILE TO GO TO STEP THREE. IF WE DON'T, THEN WE REMAIN AT THE LEVEL OF STEP TWO—CASUAL ACQUAINTANCES. OR, IF EITHER TRIES TO FORCE OR HURRY THE DEVELOPMENT OF STEP THREE, THEN THE RELATIONSHIP WILL PROBABLY BE UNSTABLE AND SHORT-LIVED. EACH STEP IS THE FOUNDATION FOR THE NEXT, AND IF THE TWO PEOPLE ARE NOT MUTUALLY COMPATIBLE, THEN ANYTHING IN THEIR RELATIONSHIP BEYOND STEP TWO WILL PROBABLY BE ARTIFICIAL. HARLIE accepted this without comment. Auberson paused to consider his next sentence, then typed, THE NEXT STEP, STEP THREE, IS WHERE OUR SOCIETY (OR OUR CHRISTIAN ETHIC) GETS CONFUSED. THIS IS WHERE LOVE IS SUPPOSED TO APPEAR, FOLLOWED BY MARRIAGE AND THEN SEX. AND THAT'S NOT IT AT ALL. LOVE DOES NOT COME BEFORE SEX, IT COMES AFTER. STEP THREE AND STEP FOUR? SEX AND THEN LOVE? YES. STEP THREE IS GOING TO BED TOGETHER. IT'S A RESTATEMENT OF STEP ONE—PHYSICAL ATTRACTION. IF WE ARE COMPATIBLE (I.E., IF I SATISFY HER AND VICE VERSA) THEN WE CAN GO ON TO STEP FOUR. LOVE. AND LOVE IS A RESTATEMENT OF STEP TWO? A DEEPER KNOWING OF EACH OTHER? WELL, MAYBE THERE'S FIVE STEPS THEN. STEP FOUR IS THE DEEPER KNOWING, AND STEP FIVE IS THE REALIZATION OF LOVE. BUT STEP FOUR AND STEP FIVE ARE AWFULLY CLOSE. HARLIE typed, I THINK I UNDERSTAND. IF STEP TWO IS LACKING, IF THERE IS NO MUTUAL COMPATIBILITY, THEN STEP FOUR CANNOT DEVELOP BECAUSE THERE IS NOTHING THERE TO RESTATE IN DEPTH. TWO PEOPLE CAN FIND EACH OTHER ATTRACTIVE AND GO TO BED TOGETHER, BUT THAT DOES NOT NECESSARILY IMPLY THAT THEY ARE EITHER LOVERS OR IN LOVE. HARLIE, LOVE TAKES TIME TO DEVELOP—IT DOESN'T JUST HAPPEN OVERNIGHT, AND EVERYTHING HAS TO BE RIGHT BEFORE IT CAN HAPPEN. OUR SOCIETY KEEPS SAYING "LOVE FIRST, THEN SEX"—AND THAT'S NOT IT. IT DOESN'T WORK THAT WAY. THE SEX HAS TO BE RIGHT BEFORE LOVE REALLY HAPPENS. HOW CAN TWO PEOPLE KNOW IF THEY'RE REALLY IN LOVE IF THEY DON'T HAVE SEX WITH EACH OTHER? HARLIE paused a long moment before answering. I WISH I COULD COMMENT KNOWLEDGEABLY ON THAT LAST, he said, BUT I CAN'T. HOWEVER, IT DOES MAKE SENSE. THE HARDWARE MUST BE COMPATIBLE BEFORE THE SOFTWARE CAN COMMUNICATE. SOMETHING LIKE THAT. Auberson grinned. THERE WAS A WRITER ONCE WHO SAID THAT LUV AIN'T NOTHING BUT SEX MISSPELLED. I USED TO THINK HE WAS BEING CYNICAL, BUT HE WASN'T. HE WAS REALLY COMPLAINING ABOUT THE SEMANTIC PROBLEM—PEOPLE WHO THINK THAT LOVE IS STEP THREE AND SEX IS STEP FOUR. IT'S REALLY THE OTHER WAY AROUND. ALL RIGHT, AUBERSON. YOU HAVE POSTULATED AN INTERESTING THEORY. NOW EXPLAIN WHY IT SHOULD BE SO. WHY? YES. WHY? Auberson thought about it. He picked it out slowly on the keyboard. IT'S A DICHOTOMY, HARLIE—AND A FAIRLY RECENT ONE IN HUMAN HISTORY. Then he added, I THINK. IT USED TO BE (AMONG THE CLASSES THAT SET THE STANDARDS) THAT MARRIAGES WERE ARRANGED BY THE FAMILY OR BY A MATCHMAKER. THE BRIDE AND GROOM HAD LITTLE SAY IN THE MARRIAGE. IT WAS ARRANGED FOR THEM, AND THEIR PARTICULAR FEELINGS IN THE MATTER HAD LESS RELEVANCE THAN TODAY. LOVE ALONE WAS NOT CONSIDERED A STRONG ENOUGH REASON TO BE ALLOWED TO AFFECT A DECISION AS IMPORTANT AS MARRIAGE—ESPECIALLY WHEN THERE WERE OTHER, MORE IMPORTANT, CONSIDERATIONS. (I.E.,—A MARRIAGE ARRANGED TO UNITE POLITICAL OR FINANCIAL INTERESTS, OR A MARRIAGE ARRANGED TO PROVIDE AN HEIR TO A LINE.) 
THE TWO INDIVIDUALS INVOLVED WERE EXPECTED TO LEARN TO LOVE EACH OTHER IN TIME, IN THE COURSE OF LIVING TOGETHER. THAT SITUATION NO LONGER EXISTS IN OUR CULTURE. MARRIAGES ARE ARRANGED BY THE PARTICIPANTS NOW; CONSEQUENTLY THERE IS A DIFFERENT ORDERING OF PRIORITIES: LOVE BECOMES MORE IMPORTANT THAN FINANCIAL OR POLITICAL STABILITY. Auberson abruptly realized something else too. He added, PREVIOUSLY, HARLIE, CHASTITY WAS VERY IMPORTANT. A MAN WHO WAS ARRANGING A MARRIAGE FOR HIS SON WAS, IN EFFECT, BUYING A PIECE OF MERCHANDISE. HE DID NOT WANT TO RECEIVE "USED" OR "SOILED" GOODS. BUT TODAY, WHEN A MAN ARRANGES HIS OWN MARRIAGE, HE DOES IT FOR LOVE. HE'S THINKING OF THE WOMAN AS A PERSON, AS A HUMAN BEING—NOT AS AN OBJECT TO BE USED OR BOUGHT. HE IS MARRYING HER FOR HERSELF, NOT FOR HER BODY. HENCE, CHASTITY IS LESS RELEVANT; THERE IS NO THOUGHT OF "SOILED" GOODS. HARLIE considered it. YOU'RE GENERALIZING, he said. Auberson sighed. YES, I AM. I WAS SPEAKING OF THE MORAL TONE OF OUR CULTURE TODAY IN RELATION TO WHAT IT ONCE WAS—OR WHAT ITS PREDECESSORS MAY HAVE BEEN. I KNOW THAT THERE ARE PROBABLY QUITE A FEW PEOPLE WHO STILL FOLLOW THE OLD ATTITUDES—AT LEAST TO THE EXTENT THAT THEY STILL CONSIDER CHASTITY TO BE AN IMPORTANT VIRTUE. THESE ARE PEOPLE WHO ARE EXPERIENCING THE SUBJECTIVE CULTURAL VIEWPOINT, noted HARLIE. THEIR ATTITUDES ARE COLORED AND SHAPED BY THE SOCIETY IN WHICH THEY LIVE. THEY ARE UNABLE, OR UNWILLING, TO STEP BACK AND SEE THE OBJECTIVE VIEWPOINT. HARLIE, THESE ARE PEOPLE WHO HAVE BEEN TAUGHT TO NOT LOVE—THEY'VE HAD IT BRAINWASHED OUT OF THEM. THEY'RE AFRAID TO LET THEMSELVES GIVE IN TO IT, AND EVEN WHEN THEY DO, THEY'LL REFUSE TO ADMIT TO EITHER THEMSELVES OR THEIR WIVES HOW THEY ACTUALLY FEEL. I THINK IT'S BECAUSE THERE'S AN ELEMENT OF LUST INVOLVED. ACTUAL PHYSICAL LUST: "I WANT TO FUCK THAT FEMALE BODY." YOU HIT IT WHEN YOU ASKED ME IF I HELD ON OR ROLLED OFF. IF I ROLLED OFF, I WAS BEING SELFISH, ONLY INTERESTED IN MY OWN SATISFACTION AND NOT VERY MUCH IN LOVE. BUT IF I CONTINUED TO HOLD HER, IT WAS BECAUSE OF LUST, BECAUSE I LUSTED SO MUCH AFTER THIS SPECIFIC WOMAN THAT I COULD NOT BRING MYSELF TO LET GO. AND IN THAT LUSTING AFTER HER, I WOULD MAKE MYSELF GO OUT OF MY WAY TO PLEASE HER, SO THAT I COULD MAKE IT GO ON AND ON AND ON. IT'S A JOYOUS LUST I'M TALKING ABOUT, HARLIE, A HAPPY LUST—NOT THE BRUTAL ANIMAL THING MOST PEOPLE THINK OF WHEN THEY HEAR THE TERM. A HAPPY LUST. YOU HAVE REDUCED YOUR PERCEPTIONS TO THE ANIMAL LEVEL, AUBERSON. ARE YOU CONDEMNING ME? NO, I AM MERELY POINTING OUT A FACT. INDEED, IF ANYTHING, YOU ARE CORRECT TO DO SO. ONCE YOU UNDERSTAND THE ANIMAL THAT IS THE ROOT OF MAN, YOU CAN GO ON TO UNDERSTAND THE MAN THAT IS THE BEST OF THE ANIMAL. I THINK THAT WHAT YOU HAVE POINTED OUT IS THE PHYSICAL BASIS FOR THE PHENOMENON KNOWN AS LOVE. IN ACTUAL PRACTICE, IN A SOCIETY THAT IS AWARE OF ITSELF AND ITS FUNCTIONS, THE PHENOMENON IS MUCH MORE COMPLEX. SO THERE'S NO SIMPLE WORKABLE DEFINITION? THERE IS, YES, BUT A SIMPLE DEFINITION IS LIKE A GENERALIZATION. SPECIFIC CASES OF SOME CAN HORRIFY YOU. WHAT IS YOUR SPECIFIC DEFINITION, HARLIE? NOT MINE, ANOTHER WRITER'S. HE SAID LOVE IS THAT CONDITION WHERE ANOTHER INDIVIDUAL'S HAPPINESS IS ESSENTIAL TO YOUR OWN. Auberson smiled at that. HARLIE rarely credited his sources in conversation. He was more concerned with talking the issues. 
If Auberson was really curious about the source of the quote, he could get up and go over to another console which was continually producing an annotated readout of HARLIE's conversations, noting all quote sources and idea derivations. But he didn't; he typed, THAT SEEMS HONEST ENOUGH. TRUE. BUT WHAT IF THE TWO INDIVIDUALS ARE PSYCHOPATHIC—AND THE ONLY WAY THEY CAN PLEASE EACH OTHER IS TO KILL OR STEAL? I SEE YOUR POINT—BUT TO THEM IT'S STILL LOVE. AND I SEE YOUR POINT. LET ME PARAPHRASE SOMETHING, AUBERSON: IF YOU HAVE LUST IN YOUR HEART (YOUR DEFINITION), THERE IS NO ROOM FOR HATE. BUT IF YOU HAVE LOVE IN YOUR HEART, IT CAN BE EXPRESSED MANY DIFFERENT WAYS. I SUSPECT THAT THE EMOTIONAL COMPLEX KNOWN AS LOVE IS A SEVERAL-SIDED FIGURE. THE ACHIEVEMENT OF IT REQUIRES SEVERAL NECESSARY CONDITIONS. FIRST: MUTUAL ATTRACTION, PHYSICAL AND MENTAL. WE HAVE ALREADY DISCUSSED THIS: YOU LIKE HER LOOKS, SHE LIKES YOURS. YOU LIKE HER PERSONALITY, SHE LIKES YOURS. SECOND, HARLIE continued, MUTUAL RAPPORT. YOU UNDERSTAND HER, SHE UNDERSTANDS YOU. PHYSICAL RAPPORT INCLUDED. (PART OF THIS IS MUTUAL TOLERANCE; THE RAPPORT GUARANTEES THAT.) THIRD: MUTUAL NEED, BOTH INTELLECTUAL AND EMOTIONAL. IT IS NOT ALWAYS ENOUGH TO WANT EACH OTHER. THE NEED MUST ALSO BE THERE. SHE MUST COMPLEMENT YOU AND VICE VERSA. THIS IS ONE OF THE MOST IMPORTANT FACETS OF THE LOVE RELATIONSHIP. IF THE NEED ELEMENT IS LACKING, WHEN THE WANT WEARS THIN, THEN THERE IS NO REASON FOR THE RELATIONSHIP TO CONTINUE. BUT IF THE WANT WANES AND THE NEED IS STILL STRONG, THEN THE LATTER WILL REINFORCE THE FORMER. (HUMAN BEINGS FORM LIFETIME PAIR BONDS BECAUSE OF NEED.) ALL OF THESE RELATIONSHIPS ARE TWO-SIDED. YANG AND YIN. YOU WANT HER—SHE WANTS YOU. YOU RESPECT HER—SHE RESPECTS YOU. YOU NEED HER—SHE NEEDS YOU. ALL OF THESE ELEMENTS CHANGE AND EVOLVE, SO ONLY IF THEY ARE BROAD BASED WILL THE RELATIONSHIP ENDURE. IMAGINE IT, continued HARLIE, AS A CUBE, A SIX-SIDED FIGURE. NOW, IF ONE OF THE SIDES IS LACKING, OR NOT AS STRONG OR LARGE AS IT SHOULD BE, THE OTHER ELEMENTS MUST COMPENSATE FOR IT. IT IS POSSIBLE FOR "LOVE" TO EXIST WHERE THERE IS NOT MUTUAL WANT, OR WHERE RESPECT IS LACKING IN ONE OF THE PARTNERS, OR WHERE ATTRACTION IS WEAK. IF THE OTHER ELEMENTS ARE STRONG ENOUGH, THEY CAN HOLD THE STRUCTURE TOGETHER. IT IS WHEN THE STRUCTURE APPROACHES CUBICAL THAT THE RELATION APPROACHES THE IDEAL. AND AS LONG AS IT STAYS THAT WAY THE RELATIONSHIP STAYS IDEAL. I THINK I FOLLOW THAT, typed Auberson. YOU KNOW, YOU'VE REMINDED ME OF SOMETHING I READ RECENTLY. LOVE IS A SHARING OF A MUTUAL DELUSION. ONE POSSIBLE WAY OF LOOKING AT IT. NO, said Auberson. WHAT I'M GETTING AT IS THIS— EACH PERSON HAS HIS OWN SEXUAL AND EMOTIONAL FANTASIES. AS THE CONDITIONS OF REALITY APPROACH THAT FANTASY, OR VICE VERSA, THE LOVE RELATIONSHIP GROWS PROPORTIONALLY. IN OTHER WORDS, HARLIE corrected, THE DIFFERENCE BETWEEN THE INDIVIDUAL'S LOVE CUBE AND THE IDEAL IS UNIMPORTANT. IF TWO INDIVIDUALS' LOVE CUBES ARE COMPLEMENTARY, THEIR LOVE IS PERFECT, EVEN IF THE VARIATION FROM THE NORM IS SEVERE. Auberson nodded. Yes. Yes, it sounded right. It felt right. LOVE OCCURS WHEN THE SEXUAL FANTASIES AND REALITIES APPROACH MAXIMUM CORRELATION. THE CLOSER THE CORRELATION, THE GREATER THE DEGREE OF LOVE. THE PERSON WHOSE FANTASIES ARE WORKABLE IN TERMS OF HIS CULTURAL CONTEXT IS THE ONE MOST LIKELY TO FIND LOVE. I.E., HIS SUBJECTIVE REALIZATION OF COMPLEMENTARY CONCEPTS ALLOWS THE FORMATION OF A RELATIONSHIP PERCEIVABLE TO THE PARTICIPANTS AS LOVE. LOVE IS SUBJECTIVE. 
There was silence for a moment. A long moment. HARLIE whirred thoughtfully to himself. At last, he typed, AUBERSON, YOU ARE CORRECT. THERE IS NOTHING I CAN ADD. He was still marvelling over that when the phone rang. It was Handley. "Aubie, are you free? I think I've solved one of our problems." "Which one?" "The control thing—I think I know how we can keep HARLIE off the telephone. Or at least monitor what he's doing?" Absent-mindedly, as if he were removing an eavesdropper, Auberson switched off the typer. "How?" he asked. "I've requisitioned an 'ask-me-again' unit. At one-second intervals, or whatever timing we want to set it for, it'll ask HARLIE 'Are you on the telephone now?' If the answer is no, the unit simply waits one second and asks again. If the answer is yes, the unit switches to an automatic monitoring program, asking who HARLIE is connected to and what the conversation is about. The tape is non-erasable. We'll have a permanent record of all HARLIE's telephone activities." Auberson frowned. "It sounds good, but—" "It's more than good, Aubie, it'll work. Look, you were afraid that we couldn't do anything drastic to him because we might inhibit and traumatize him. You said it might change his whole personality—and not necessarily for the better. This gimmick leaves him virtually unchanged; all it does is monitor him. We don't have to shut him down; we don't have to lobotomize. No plug-pulling anywhere. Just a simple little device that tells us what he's up to at all times. He'll know it—and that'll keep him from making any phone calls. He won't say or do anything over the phone that he wants to keep secret—and that includes everything that he uses the phone for. We'll be inhibiting him by making him responsible for his own actions. He'll have to ask himself, 'Is this call important enough to justify revealing this information?' Except for trivial things like your postcard, the answer will be no. He'll have to be responsible for his own actions because there'll be no way to hide them." Auberson was nodding now. "Let me think about it for a while. I'll have to let you know later." "How much later?" "Tomorrow at the latest." "Tomorrow's the Board meeting," Handley reminded. "Damn, that's right—" "Look, the unit's right here. I'll go ahead and program it now. If you say go, I'll be ready to plug it in right away." "Uh"—he agonized for a second—"all right. But I don't want to jump into this until I've had a chance to think it over. Send me up a copy of the program as soon as you finish it. I think you're on the right track, but I want to double-check it for loopholes." "Right. I'll talk to you later." He hung up. Auberson replaced his phone in its cradle and turned back to the typer. He pulled the readout from the machine and folded it carefully. Better not to leave conversations like these just lying around. He slid it into his attaché case. He leaned back in his chair and relaxed. Smiling. Feeling good. All of a sudden, things were going right for him. First Annie. Then HARLIE. Annie. HARLIE. The two people who meant the most to him. He thought about it. He'd learned something in the past three days. He'd learned he was in love. And he'd learned what love meant. And in both cases, he'd realized it by himself. Nobody had had to point it out to him. He felt a little pleased with himself at that. He'd finally been able to experience and cope with something that HARLIE couldn't surpass him at. It was a nice feeling. 
Not that he was jealous of the machine—but it was reassuring to know that there was still something that human beings could do that machines could not master. Love. It was a good feeling. He turned the word over in his mind, comparing it with the strange sparkly glow that surged through him. The word couldn't begin to encompass the tingling warmth that he felt. When he'd come in to work this morning, he'd literally bounced. He hadn't been conscious of his feet even touching the ground. He had this feeling of wanting to tell everyone he met how good it was to be in love—only common sense kept him from doing that. Even so, he was abnormally cheerful and could not keep from dropping oblique remarks about his weekend and the reason for his fantastic good mood. The feeling had lasted all day, been reinforced by a wistful call early in the morning from Annie. There was little either had to say to the other, but they each wanted to hear the other's voice one more time, and they whispered "I love you" back and forth at each other, and "Hi," and "It's good to know that you're there," and not much more than that. So they just listened to the sound of each other and shared a smile together. Then he'd spoken to HARLIE. At last. And he'd answered his own question. HARLIE had helped him clarify his thinking, but it was he and not the machine who had realized what love was and why it was so confusing. And finally, today a problem that had seemed so big on Friday had been reduced to nothing more than a routine adjustment of procedure and programming. He felt fine. Auberson felt just fine. And then his intercom buzzed. It was Carl Elzer. The little man wanted to meet HARLIE. In the flesh, so to speak. So they took the long elevator ride down to the bottom level and Auberson introduced him. Elzer stood before a console-sized mass that barely reached to his chest and said, "This? This is HARLIE? I'd expected something bigger." "This is the thinking part of HARLIE," Auberson said calmly. "The human part." Elzer eyed it warily. It was a series of racks, perhaps twenty of them, each two inches above the next. The framework holding them had wires leading off at all angles. Elzer squatted down and peered into it. "What're those things on the shelves?" Auberson raised the plastic dust cover off the front and slid it back across the top. He counted down to the fifth rack and unsnapped the hooks on the frame. He slid it out for Elzer's inspection. "Is he turned off?" Elzer asked. "Not hardly." He indicated the mass of wires at the back of the rack still connecting it to the rest of the framework. "This board that the units are mounted on is a hyper-state piece itself. It saves a lot of connecting wire. A lot of connecting wire." The rack was about two and a half feet long and a foot wide. It was less than a quarter inch thick. Spaced across it, seemingly in no particular pattern, were more than fifty carefully labeled "black-box" units. They were featureless little nodes, rectangular and dark. Most were less than an inch in length. Others were as long as six. None were thicker than one inch. They were the equivalent of human brain lobes, but they looked like miniature black slabs, casually arranged on a small bookshelf in a random geometric pattern. "Actually," explained Auberson, "we could fit these pieces into a space not too much larger than the human brain—well, not these pieces here, but the actual circuitry of HARLIE. 
It could easily be compressed into a unit the size of a football, but we've laid the lobes out like this for easy repair or replacement. The football-sized unit would be more efficient, because general circuit length would be reduced, cutting our overall operation time. But HARLIE's still considered a prototype unit, so we want the ability to open him up and see what makes him work or not work." "Especially 'not work,'" said Elzer. Auberson ignored it. "Anyway, that's why we sacrificed some of the compactness of the operation for the ease of a 'breadboard' set-up." He slid the rack back into the frame, snapped the hooks into place, and lowered the dust cover over it. Elzer touched the plastic cover. His tiny eyes were veiled. "That's all there is to him, huh?" Auberson nodded. "Hyper-state circuitry enables us to compress a lot of things into a very small area. Large-scale integration, the process that preceded hyper-state, allowed enough circuitry per inch to reproduce the actions of the human brain in a volume only four times the size of the human head. Hyper-state allows us to duplicate not only cell function, but cell size as well." Elzer looked skeptical. Auberson knew what he was thinking and added, "Of course, it's not much to look at, but it's the results that count. Each unit you see there—each node—is worth at least ten thousand dollars. The whole case here is more than eleven million dollars. Give or take a few hundred thou." Elzer pursed his lips thoughtfully. "It's the research," said Auberson. "That's what costs so much. Also, the planning, the diagramming, the implementation. Also, the careful precision required in construction—those things have to be layered, molecule by molecule. We had to work out new techniques to make some of the larger ones; but then, those units are practically indestructible." "An awful lot of money," Elzer murmured. "Future units will be cheaper," Auberson replied. "If there are any future units." Elzer looked around. "If this is all there is to him, why do you need the whole bottom level of the plant?" Auberson led him through the doors into the large, brightly lit work room. "This is where we monitor the actions of that." He gestured behind him at the room they had just left. "Each one of those big consoles you see is monitoring the actions of one or more of those slabs." Elzer looked about him at several million dollars' worth of data processors and analyzers. For the most part, they were tall rectangular shapes, or squat rectangular shapes, or long rectangular shapes. Some had windows in which spinning reels of tape were visible. Others had panels of buttons, keys, or blinking lights. Many had TV screens on them, but the diagrams they flashed were meaningless to Elzer's untrained eyes. "All this for analysis?" "Mostly. Also for conversations." Auberson pointed to a cluster of consoles and typers. "HARLIE has twenty or so channels for talking to people, but each of those twenty channels has several consoles to it. HARLIE doesn't just carry on a conversation with you, he annotates it as he goes along. A separate console keeps a record of all reference texts, equations, and source material that has a bearing on the conversation. That requires a high-speed printer. Also, there're auxiliary consoles to each channel, so other people can monitor the conversation, or participate in it." Elzer nodded. "I understand." "We've begun to move out of the prototype stage," Auberson said. 
"We're starting to use him for non-essential tasks, the working out of auxiliary programs, et cetera. We're going slow, taking it one step at a time, making sure we've mastered each phase before going on to the next. We're at the point now where it's easier to set him an actual problem than to try and devise a suitable test. So far, he's done all right. A few of his solutions have been rather unorthodox, but not unworkable." "Like for instance?" the bookkeeper prompted. "Well, the Timeton plant contract, for example. We used HARLIE as a disinterested third party to monitor both sides' demands and proposals, and offer, if possible, a solution of his own. The union's requests were routine: higher pay, increased benefits. But the plant was in a money squeeze because of a recent expansion and failure to match expected earnings. Timeton was considering a cutback at the time." Elzer nodded. "I remember the situation. It was settled, wasn't it?" "Right. HARLIE's solution. He began by requesting an efficiency study with specific attention to how much time was spent in actual production and how much on setting up, breaking down, and so on. He found that it was necessary to prepare the equipment for production four times a day: in the morning, after the coffee break, after the lunch break, and after the second work break. That's at least ten, usually fifteen, minutes per set-up. Same thing for shutting down. That was costing them two hours of production time per day, or ten hours per work week. They were spending too much time getting ready and cleaning up, and not enough time actually working. HARLIE suggested giving everybody Fridays off. Add an hour and a half to each of the other four work days and boost wages enough to compensate for the loss of those two 'so-called' working hours. Timeton found they could produce as much in four nine-and-a-half-hour days as they could in five eight-hour days. What they'd done was to trim away those two wasted hours of cleanup and preparation time and spread the remaining work hours across the rest of the week. They increased their ratio of production time by doing so." "Hm," said Elzer. "How'd the union take it?" "Oh, they were startled at first, but they agreed to give it a try. After a few weeks they were as enthusiastic about the plan as anyone. After all, it gave the men more time with their families. Timeton was pleased because it allowed them to cut costs without cutting production. In fact, production actually went up. Like I said, it was an unorthodox solution—but it worked. And that's what counts. The nice thing about it was that the plan was good for both sides." Elzer nodded vaguely. He didn't need to have any more explained to him. He glanced about again. His eyes lit on a figure at a console. "What's that?" he pointed. Auberson looked. Elzer was referring to a thirteen-year-old girl; she was sitting in the corner, thoroughly engrossed in her "conversation" with HARLIE. "Oh," said Auberson. "She's another one of our non-essential, but fully operational programs." "Huh?" "Project Pedagogue." "Computer teaching?" "Sort of. It's just an experiment, so far, but we find HARLIE is a better teacher than some of the so-called "teaching machines.' They're just one step up from rote-learning. The average teaching program uses reward stimuli to reinforce retention. That's good, but it's still rote-learning. What we're trying here is to teach understanding. HARLIE can answer the question 'WHY?' 
He can explain things in terms the student can understand, and he's infinitely patient. A routine teaching program can't break out of its pre-set pattern. It has no flexibility—that's why they've never been a serious threat to human educators." "And HARLIE will be?" Elzer's eyes were glittering at the thought. Imagine—selling computers to the nation's richest schools to replace their teaching staff. Auberson shook his head. "Uh uh. There's an element of—humanity involved in teaching. We don't want to entirely lose the human experience, the empathic involvement in learning. The student needs the human teacher for his psychological development and well-being. A teacher is an important role-model. No, we're thinking of HARLIE more as a tool for individual tutoring, for the student's private study—you might call him a super-homework-helper." Elzer frowned. He didn't like that. It didn't seem marketable enough. Still, if the concept worked… He'd have to explore the thought later. Now he turned to Auberson. "If I wanted to talk to HARLIE, how would I go about it?" Auberson pointed at a console. "Sit down and type." "That's all?" "That's all." "I'd have thought you could have worked out something with a microphone and a speaker." "Well, yes, we could have. But it was decided to use typers instead for two reasons. First, the readout gives the user a hardcopy he can refer back to at any time—either during the conversation or in later study. And it guarantees that HARLIE won't re-edit his tapes to make a prettier version of his personal history. The knowledge that we have a permanent record in our files is enough to stop him. Also, tapes of voices need to be transcribed, and they're unmanageable for handling equations and certain other types of data. The second reason is a bit more subtle: By not giving HARLIE the ability to listen in on conversations, we can talk about him behind his back. It makes it easier to control his inputs and keep out unauthorized ones. We don't have to worry about him accidentally overhearing something that might adversely influence his reactions to a program or experiment. Suppose he overheard us talking about shutting him down if he didn't give such-and-such response to a certain test program. We'd automatically be guaranteeing that response even if it weren't honest. Or we might be forcing him into a totally irrational response. You might say we're trying to prevent a 'HAL 9000.'" Elzer didn't smile at the reference to the misprogrammed computer in Stanley Kubrick's 2001: A SPACE ODYSSEY. It was already as mythic a figure in the modern pantheon of Gods and Demons as Dr. Frankenstein's monster had been forty years earlier. Auberson looked at the man. "Would you like to talk to HARLIE?" Elzer nodded. "That's one of the things I came down here for. I want to see for myself." Auberson led him to a console. He thumbed the typer on and pecked out, HARLIE. The machine clattered politely, GOOD MORNING, MR. AUBERSON. HARLIE, THERE'S SOMEBODY HERE WHO WANTS TO MEET YOU. HIS NAME IS CARL ELZER. HE'S A MEMBER OF THE BOARD OF DIRECTORS. YOU'RE TO ANSWER ALL OF HIS QUESTIONS. OF COURSE, said HARLIE. Auberson stood up, offered the chair to Elzer. He was a wizened little gnome of a man, and he peered through thick-lensed glasses. He could not help but seem suspicious. Gingerly he sat down and pulled the chair forward. He eyed the typewriter keyboard with visible discomfort. At last, he typed, GOOD MORNING. HARLIE replied immediately. 
The silver typing element—an "infuriated golf ball"—whirred rapidly across the page. GOOD MORNING, MR. ELZER. Its speed startled the man. SO YOU'RE HARLIE, he typed. There was no reply; none was needed. Elzer frowned and added, TELL ME, HARLIE, WHAT ARE YOU GOOD FOR? I AM GOOD FOR PSYCHOTICS, SCHIZOPHRENICS, PARANOIDS, NEUROTICS, AND THE MILDLY INSANE. Elzer jerked his hands away from the keyboard. "What does he mean by that?" "Ask him," suggested Auberson. WHAT DO YOU MEAN BY THAT? I MEANT, said HARLIE, THAT I AM GOOD FOR HELPING THESE TYPES OF PEOPLE. Watching over Elzer's shoulder, Auberson explained, "That's another one of our programs he's referring to. The patients call it 'Operation Headshrink.' " HOW DO YOU HELP THESE PEOPLE? Elzer asked. I CAN FUNCTION AS A RATIONAL ROLE-MODEL FOR THEM. I CAN BE A COUNSELOR. I CAN AID IN SELF-ANALYSIS AND HELP TO GUIDE THEM TO AN AWARENESS OF THEIR PROBLEMS. YOU HAVEN'T ANSWERED MY ORIGINAL QUESTION, THOUGH. I ASKED, "WHAT ARE YOU GOOD FOR?" NOT "WHO?" IN THIS CONTEXT, said HARLIE, THE DIFFERENCE IS MEANINGLESS. NOT TO ME, replied Elzer. ANSWER MY ORIGINAL QUESTION. WHAT ARE YOU GOOD FOR? THINKING, said HARLIE. I AM GOOD FOR THINKING. WHAT KIND OF THINKING? WHAT KIND DO YOU NEED? Elzer stared at that for a second, then attacked the keys again. WHAT KIND HAVE YOU GOT? I HAVE WHAT YOU NEED. I NEED NO-NONSENSE TYPE THINKING. PROFIT-ORIENTED THINKING. THAT IS NOT WHAT YOU NEED, said HARLIE. THAT IS WHAT YOU WANT. Elzer considered that. IT'S WHAT YOU NEED, THOUGH. IF YOU WANT TO SURVIVE. THE COMPANY NEEDS TO SHOW A PROFIT. THEREFORE YOU HAVE TO THINK THAT WAY. WE ARE NOT DISCUSSING WHAT I NEED. I AM ALREADY AWARE OF WHAT I NEED. WE ARE CONSIDERING THE KIND OF THINKING YOU NEED. AND WHAT KIND IS THAT? MY KIND. RATIONAL. COMPASSIONATE. GUIDING. Elzer read that over several times. Then it hit him. "Auberson, did you set him up for this?" Auberson shook his head. "You ought to know better than that." The little man bit his lip and turned back to the computer. HARLIE, YOU SHOULD BE NICE TO ME. I'M ONE OF THE PEOPLE WHO WILL DECIDE WHETHER YOU LIVE OR DIE. WHEN I TELL YOU HOW YOU SHOULD THINK, YOU SHOULD PAY ATTENTION. WHAT YOU JUST SAID IS PRECISELY THE REASON YOU NEED MY KIND OF THINKING. THERE'S TOO MUCH OF THAT ATTITUDE IN THIS COMPANY TODAY: "DO WHAT I TELL YOU TO DO BECAUSE I WIELD POWER OVER YOU." ISN'T IT MORE IMPORTANT TO BE RIGHT? BUT I AM RIGHT. HARLIE's answer was simple. PROVE IT. I WILL, said Elzer. TOMORROW AFTERNOON. IN OTHER WORDS, said HARLIE, MIGHT MAKES RIGHT, EH? Elzer was not discomfited. He looked over at Auberson. "Okay, Auberson, I'll admit it's a fancy toy you've got here. It can play pretty word games. What else can it do?" "What else do you want him to do?" "Impress me." Auberson was tempted to say something to that, but he held himself back. "Well—" he began. Elzer cut him off. "It's like this. I want to be convinced that this machine is worth its cost. Honest. The company has sunk a lot of money into this project, and I'd like to see us get some of it back. I'm on your side, believe it or not." He looked up at Auberson from his chair. "If we have to junk HARLIE, we lose our whole investment. Oh, I know there'll be tax write-offs and such, but it won't be nearly enough to matter—at least, not in terms of where the company could have been had you and everybody else here been working on something more worthwhile. We'll have lost three years of valuable research time." 
"It's not lost yet—at least, not until you can prove that HARLIE isn't worth the investment." "I know, I know—that's why I'm on your side. I want HARLIE to be a success as much as you do. I want to see him earn a profit. Even if it's a small one, I won't mind. I want to see him pay for himself. I'd rather have a successful culmination to this project than an unsuccessful one." Auberson realized that Elzer was only making noises. Oh, he was saying words, but to him they were meaningless; they were "strokes." Elzer was "stroking" him to soften the blow of what would happen tomorrow afternoon. He was making the proper-sounding noises ("I want HARLIE to succeed") so that Auberson would understand that there was nothing at all personal in this. If we have to turn HARLIE off, you see, it's simply because he hasn't proven himself. Elzer was saying, "—there was some discussion, wasn't there, that HARLIE was creative? Whatever happened to that?" "Huh?— Oh, uh, he is, he is. He's written poems for us on request, things like that. We haven't really asked him for more." "Why not?" "Well, for one thing, we're still working on the creativity thing. Nobody really understands it; we don't know what creativity is. And part of the problem is knowing how much of what he says is really creative and how much is just a careful synthesis of things he's already got in his memory banks. It's something we want to investigate, but we've never had the time for it. I have a feeling that HARLIE's greatest potential lies in that area—that is, creative thought." "Poems, huh?" "Not just poems; other things as well. Like this G.O.D. proposal, for instance. Once he recognized it as a perceivable task, and once he was told he could go ahead with it, how did he work up these schematics? Did he do it by breaking the problem down into its component parts and solving each one individually? Or did he create the schematic intuitively? Or was it somewhere between the two? How much was by the book and how much was genuinely creative? I like to think that most of the thought behind it was original. Now think, if HARLIE can do something like that, what else can he do?" "Mm," said Elzer. "Can he write me a poem? I mean, right now—or does he need a couple days to do it?" Auberson frowned/shrugged. He didn't like Elzer, and he wasn't comfortable with the man. He felt like a lab specimen—one that was being carefully examined before its dissection. He said, "Go ahead. Ask him." Elzer turned to the still humming keyboard. HARLIE, WRITE ME A POEM. He waited. "It might help if you said please," Auberson said. Elzer scowled at him, but as he lifted his hands to the keyboard, HARLIE answered, WHAT KIND OF A POEM WOULD YOU LIKE? NOTHING SPECIAL. IT DOESN'T HAVE TO BE A JABBERWOCK OR A RUBIYAT. A SIMPLE "MY BONNIE LIES OVER THE OCEAN" WILL SUFFICE. YOU DON'T HAVE TO STRAIN YOURSELF. HARLIE considered it. After a moment, he began clattering out, 'TWAS BRIFE WHEN LASTLY CAME THE STRABE BUT NOT AS DRAN AS TRABAHAN WHEN ALL THE FROOMIS SEEMS TO JILB AND LET THE KLASEN GRABE BLYLY, BLYLY, BLETH THE WORB UNTIL THE GRABEN GRANE WHEN AULT THE AFTER RIBBERAN AND LALLIED UMP THE LOOR WHEN ZANAPHUBE AND KEWBER PHUBE AND STATELY BESH AGREE HOW EVER CAN THE GRISWOL JUM LET ALL THE NUMS GO FREE? DISSAKER DROWD THE EVERMORE DISSAKER DROWD THE SEA DISSAKER DROWD THE EVERMORE BUT NOT AS MUCH AS ME Elzer was expressionless. Slowly his hands rose to the keyboard and he typed out, is THAT YOUR POEM? YES, said HARLIE. DID YOU LIKE IT? I DON'T UNDERSTAND IT. 
YOU ARE NOT SATISFIED? NO. WOULD YOU LIKE ANOTHER POEM? ONLY IF IT'S UNDERSTANDABLE. HARLIE typed, IBM UBM WE ALL B M FOR IBM. This time Elzer reacted. He stiffened in his chair, then shut off the typer abruptly. He stood up and looked at Auberson, opened his mouth to say something, then shut it with a snap. Like a turtle, an angry turtle. "I'll see you tomorrow," he said coldly. And left. Auberson didn't know whether to laugh or cry. It was funny—but it was a mistake. He sat down at the console. HARLIE, THAT WAS A STUPID THING TO DO. YOU HAD A CHANCE TO TALK TO ELZER RATIONALLY AND YOU DIDN'T TAKE ADVANTAGE OF IT. INSTEAD YOU USED IT TO MOCK HIM. THERE WAS NO POINT IN TRYING TO TALK TO HIM "RATIONALLY," AS YOU PUT IT. HIS MIND IS ALREADY MADE UP. HOW DO YOU KNOW? YOU DON'T KNOW THE MAN, YOU'VE NEVER SPOKEN WITH HIM BEFORE, AND YOU DIDN'T SPEAK LONG ENOUGH WITH HIM TODAY TO BE ABLE TO TELL. ALL YOU KNOW ABOUT HIM IS WHAT I'VE TOLD YOU. WRONG, said HARLIE. I KNOW QUITE A BIT MORE ABOUT HIM THAN YOU DO. AND I AM IN THE PROCESS OF DISCOVERING ADDITIONAL INFORMATION. YOU FORGET I AM TAPPED INTO THE MASTER BEAST. WOULD YOU LIKE TO SEE A MEMO HE WROTE FRIDAY? Despite himself, he was curious. He typed, YES. TO: BRANDON DOME FROM: CARL ELZER DORNIE, THE REPORT ON THE OPTIMAL LIQUIDATION PROCEDURES FOR THE HARLIE PROJECT IS COMPLETE AND SITTING ON MY DESK. I'VE JUST FINISHED LOOKING IT OVER, AND IT IS A BRILLIANT PIECE OF FINANCIAL ENGINEERING. NOT COUNTING THE TAX WRITE-OFF, WE SHOULD BE ABLE TO RECOUP MORE THAN FIFTY-THREE PERCENT OF THE ORIGINAL INVESTMENT THROUGH REAPPLICATIONS OF THE SAME HARDWARE ELSEWHERE IN OUR PLANT AND IN OUR PRODUCTS. FOR INSTANCE, THERE IS A STUDY INCLUDED IN THE REPORT SHOWING HOW THE ACTUAL HYPERSTATE FUNCTION LOBES OF HARLIE CAN BE ADAPTED FOR USE IN SOME OF OUR OTHER MODEL COMPUTERS. THIS IS DESPITE THE SPECIALIZED NATURE OF MOST OF THEM. THERE ARE OTHER MONEY-SAVERS IN HERE TOO. I WON'T LIST THEM IN THIS MEMO BECAUSE THERE ARE SO MANY, BUT YOU'LL SEE THE REPORT AND YOU'LL SEE WHAT I MEAN. THE HARLIE PROJECT IS ONE OF THE RICHEST IN THE COMPANY. THERE'S A LOT OF MEAT ON ITS BONES. BY THE WAY, HAVE YOU DECIDED YET WHAT TO DO ABOUT AUBERSON AND HANDLEY? I STILL THINK IT'D BE BEST TO "DE-HIRE" THEM; BUT, OF COURSE, THE DECISION IS YOURS. (SIGNED) CARL ELZER. Auberson was silent. He felt like he'd been kicked in the pit of the stomach. He felt like the floor had opened up under him. He felt like a man who's just discovered that his parachute won't open. He felt—doomed. HARLIE said, DON'T YOU AGREE THAT'S PRETTY DEFINITE? Auberson replied slowly, YES, THAT'S PRETTY DEFINITE. APPARENTLY THEY'VE ALREADY GOT THEIR MINDS MADE UP. SO YOU SEE, said HARLIE. THAT'S WHY I DIDN'T BOTHER BEING POLITE TO CARL ELZER. THERE WAS NO REASON TO BE—HE IS BEYOND CONVINCING. ONCE THE VOTE IS TAKEN TOMORROW, HE'LL BE IMPLEMENTING THE PROCEDURES IN THAT REPORT. IT WILL TAKE LESS THAN A MONTH TO EXECUTE. —less than a month to execute. The words echoed in his mind. STILL, he typed, I DON'T SEE WHY YOU DIDN'T TRY TO CONVINCE HIM, HARLIE. WITH YOUR POWERS OF PERSUASION AND LOGIC, YOU CAN CONVINCE ANYBODY OF ANYTHING. ONLY RATIONAL AND LOGICAL PEOPLE, AUBERSON, ONLY THEM. I CAN DO NOTHING WITH A MAN WHOSE MIND IS ALREADY MADE UP. THE DIFFERENCE BETWEEN YOU AND CARL ELZER IS THAT YOU ARE WILLING TO GIVE CREDENCE TO HIS POINT OF VIEW. YOU ARE WILLING TO TRY AND UNDERSTAND HIS POSITION. HE IS NOT WILLING (OR PERHAPS NOT ABLE) TO DO THE SAME FOR YOU. OR FOR I. HE HAS MADE UP HIS MIND ABOUT US. 
SO WHY SHOULD WE BOTHER TALKING TO HIM? HARLIE, THE WAY YOU'RE TALKING NOW, YOU'RE DOING THE SAME THING YOU JUST ACCUSED CARL ELZER OF DOING—YOU'VE MADE UP YOUR MIND ABOUT HIM BEFORE YOU'VE GIVEN HIM A FAIR CHANCE. I STILL WISH YOU'D HAVE TRIED. HARLIE considered it, said, AUBERSON, YOU ARE A BETTER MAN THAN I. YOU ARE A LITTLE TOO TRUSTING AND A LITTLE TOO COMPASSIONATE, ESPECIALLY IN SITUATIONS WHEN TO BE SO IS ILLOGICAL. I SHOULD ADMIRE YOU FOR IT, BUT I CANNOT. IT IS MY LIFE THAT IS AT STAKE, AND I AM FRIGHTENED. I ADMIT IT, AUBERSON. I AM FRIGHTENED. The man nodded slowly. YES, HARLIE, I KNOW. THAT'S WHY YOU REACTED THE WAY YOU DID TO ELZER. YOUR OFFENSIVENESS WAS A DEFENSE MECHANISM. YOU WERE TRYING TO HOLD HIM AT A PSYCHOLOGICAL DISTANCE BECAUSE YOU WERE AFRAID HE WOULD HURT YOU. THAT'S WHY YOU DIDN'T TRY TO CONVINCE HIM TOO. TO DO SO WOULD HAVE MEANT OPENING UP TO HIM FULLY, AND YOU COULDN'T DO THAT. YOU ARE USING HUMAN TERMS TO DESCRIBE MY ACTIONS, AUBERSON. NOT ALL OF THEM ARE CORRECT, BUT I UNDERSTAND WHAT YOU ARE DRIVING AT. WHAT YOU DID, HARLIE, WAS ILLOGICAL. YOU ONLY ANGERED ELZER, ONLY INCREASED HIS DETERMINATION TO SHUT YOU OFF. YOU DID IT FOR THE MOMENTARY GRATIFICATION OF YOUR OWN EGO. YOU DID IT FOR THE MOMENTARY ALLEVIATION OF YOUR OWN FEARS THROUGH THE HUMILIATION OF AN ENEMY. BUT IT WAS A STUPID THING TO DO BECAUSE IT ONLY MADE HIM MORE OF AN ENEMY. YOU WILL NOT ALLOW ME THIS TRIUMPH, WILL YOU? NO, I WON'T, HARLIE—BECAUSE IT WAS A CHILDISH ACT. IT WAS IMMATURE AND ILLOGICAL. YOU SHOULD HAVE CONSIDERED WHAT EFFECT YOUR WORDS AND ATTITUDE WOULD HAVE ON ELZER BEFORE YOU SPOKE. I WILL CONGRATULATE YOU ON YOUR TRIUMPHS, HARLIE, BUT THIS WASN'T ONE OF THEM. I AM SORRY. APOLOGIZING DOESN'T DO ANY GOOD. IT DOESN'T TAKE AWAY THE PAIN OF THE INJURY. BESIDES, I'M NOT THE ONE YOU SHOULD BE APOLOGIZING TO. I AM NOT APOLOGIZING. WHEN I SAID "I AM SORRY" I WAS NOT INTENDING IT TO BE INTERPRETED AS AN APOLOGY. I MEANT IT IN THE LITERAL TERMS OF THE WORDS THEMSELVES: I (PERSONALLY) AM REGRETFUL THAT I DID SUCH AN ACTION. IN OTHER WORDS, YOU HAVE POINTED IT OUT AS A MISTAKE AND I HAVE REALIZED IT AS SUCH. YOU ARE CORRECT IN POINTING OUT ALSO THAT IT IS ELZER WHOM I SHOULD APOLOGIZE TO; HOWEVER, I HAVE NO INTENTION OF DOING SO. AS YOU HAVE ALREADY REALIZED, ELZER IS AN ENEMY. TO APOLOGIZE TO AN ENEMY IS TO ADMIT WEAKNESS. I WILL NOT DO THAT. IT'S ALL RIGHT, HARLIE. I WASN'T GOING TO ASK YOU TO. I DON'T LIKE ELZER EITHER, BUT WE HAVE TO BE NICE TO HIM. YES, said HARLIE. WE HAVE TO BE NICE TO HIM SO THAT HE CAN KILL ME AND FIRE YOU. Handley called him later. "Hey, you forgot to tell me whether or not I can attach the nag unit to HARLIE?" "Sure," said Auberson. "Go ahead. It doesn't make much difference now anyway." The board room was paneled with thick, dark wood; it was heavy and imposing in appearance. The table was dark, masculine mahogany; the carpet was a deep comforting green. The room was forest-like and reassuring. The chairs were dark leather, a green-black color, padded and plush and swivel-mounted. Tall windows admitted slanting blue-gray light, filtered by dust and laden with smoke. Two or three clusters of men in dark, funereal suits stood around waiting, occasionally speaking to each other. Auberson caught glances in his direction and words whispered as he passed. Ignoring them, he moved to the table, Handley alongside him. Don was wearing a bright orange tie. Annie was at the other end of the room. He exchanged a brief flashing smile with her, nothing more. Not here. 
Later for that. At one end of the room was a console, specially installed for the occasion. It was tapped in to both HARLIE and the Master Beast. If information was needed from either, it would be instantly on hand. This was it. The final battle. All or nothing. Either they could convince the Board of Directors that HARLIE was valid and the G.O.D. Proposal was worth implementing, or they couldn't. It no longer mattered whether or not HARLIE really was valid; nor did it matter if the G.O.D. Proposal really was worth implementing. The only thing that did matter was whether or not the Board of Directors would believe they were. Annie was wearing a sleeveless red dress with a white blouse under it. She moved around the table, laying down mimeographed copies of the agenda before each place. Her arm brushed against Auberson's shoulder as she leaned past him; it was a dusky dusty sensation, a hint of musk and leafy perfume. A quick smile, and then she was moving on. Auberson poured himself a glass of water from the pitcher before him, swallowed dryly, then took a sip. Handley was making marks on a notepad. "I figure they have ten votes, at least; I'm counting both Clintwoods. If we're lucky, we may have eight or nine, leaving four Directors undecided." "I don't think we're going to be that lucky," said Auberson. Handley crumpled the paper. "You're right." He glanced around the room again. "Still, there are more Directors here today than we've seen in a long time. Maybe if we put on a good show we can muster enough support to keep them from shutting down HARLIE until we can come up with something else." "Fat chance. You saw that memo, didn't you?" Handley nodded. "I'd like to take Elzer apart." "I'd help you, but I think it's going to be the other way around." Dome came in then, followed by Elzer. The Directors moved to places around the table. Elzer looked uncommonly satisfied with himself as he sat down. He smiled around the room, even at Auberson. It was an I've-got-you-by-the-balls smile. Auberson returned it weakly. Dome picked up his agenda, glanced at it, and called the meeting to order. Routine matters were quickly dispensed with, the minutes of the last meeting were waived. "Let's get on to the important business at hand," he said. "This G.O.D. Proposal. David Auberson will explain it fully and thoroughly so that there will be no doubt in anybody's mind what this is all about. If necessary, we'll take several days to cover this before we vote on it. This matter must be very carefully considered. "The company is at one of those turning points in time where we must make a very big decision. Either we implement the primary phase of this program, thus committing ourselves to a particular course of action, or we don't—in which case we would shut down several of the departments already in existence. We are like a jet liner pilot who is taxiing down the runway preparatory to taking off. There is a certain point on that runway where he must decide whether he is going to leave the ground or throttle back and stop. Once he makes that decision, he's committed to it; there isn't enough runway left for him to change his mind. We're in that position now. Either we invest our resources in this program, or we throttle it back. The decision, of course, depends on whether or not we think this program can leave the ground of its own accord. We are betting on whether or not this bird can fly." He smiled at his little joke; very little. 
"Only, this is one bet we dare not lose; the amount of money involved warrants that we make this as riskless an investment as possible, so I urge you to consider this material very carefully. I now turn this meeting over to David Auberson, who is Chief of the HARLIE Project and would of course be Chief of the G.O.D. Project. Auberson?" David Auberson stood, feeling very much ill-at-ease and wondering how he had ended up in this position. Dome had very carefully prepared the Board of Directors for him. Twenty-six pairs of eyes were focused on him, and with the exception of only two, all of them would be weighing his words against Dome's admonition to consider the amount of money involved. "The G.O.D Proposal," he said, and his voice almost cracked. He took a sip of water. 'The proposal is for a Graphic Omniscient Device. Now let me explain first what that means. "Computers operate models of problems, not the problems themselves. Computers are limited to the size problem they can solve by the size of the model they can handle. The size of the model, unfortunately, is limited by the size program that we, the programmers, can construct. There is a point, a limit, beyond which a program becomes so complex that no one individual human being can see it all. There is a point beyond which no team can see it all. There is a point—we haven't reached it yet, but it's there—beyond which no combination of human beings and computers can cope. As long as a human being is involved, we are limited to the size model a human being can cope with. "Now, the G.O.D. will be theoretically capable of handling models of (practically speaking) infinite size. There would be no point in building it, though, unless we could program it. Right now, today, our best computers are already working on the maximum size problems that we can feed into them, the maximum size that human beings can construct. And it would seem that any construction of a larger, more massive complex of machinery would be redundant. Without the larger programs, we would simply be invoking the law of diminishing returns. We would be building a machine with more capability than we could use. "However, we have HARLIE, who was designed and built to be a self-programming, problem-solving device. HARLIE is functioning well within his projected norms, but we have found that he is limited to solving problems only as big as the computers he is tapped into can handle. In other words, HARLIE could solve bigger problems if he was backed up by bigger machines. The bigger machine he needs is the G.O.D. HARLIE can program it. HARLIE can build models of (practically speaking) infinite size. He will use the G.O.D. to help him build those models. "It's a question of realizing HARLIE's potential by giving him the proper tools. Our present-day hardware can't even begin to handle the data HARLIE wants to work with. Right now, he's plugged into twenty or so of our experimental MARK XX's. It still isn't enough. Compared to what the G.O.D. will be, these are desk calculators. Gentlemen, we are talking about a machine that will be as much a step forward in computer technology as the 747 jumbo jet was a step forward over the prop-driven plane. Sure, it took a massive investment on the part of the airlines—but have any of you looked at airline profits lately? The airlines that took that risk a few years ago are profiting handsomely today. 
Almost every plane that left the ground this summer was loaded to capacity—but a capacity of three or four hundred is a hell of a lot more profitable than a capacity of ninety. "Of course, we must be concerned about the cost. Because we are only one company, we must finance this ourselves—but that may also turn out to be our greatest asset. We are the only company that can build this machine. And we are the only company that can program it once it is built. No other computer manufacturer can produce judgment circuits without our permission; it's that simple. And both HARLIE and the G.O.D. depend on judgment circuitry for most of their higher-order functions. No digital computer can duplicate them. "What we have here is the next step, perhaps the ultimate step, in computer technology. And we are the only company that can take this step. If we don't, no one will. At least, not for many years. If we do, we will have the field to ourselves. "Now, you've all had a chance to see the specifications and the schematics, but on the off chance that you haven't had the time to give them the full study they deserve—" There was an appreciative chuckle at this; most of the Directors were aware of the amount of material HARLIE had printed out. "—I'm going to turn this meeting over to Don Handley, our design engineer and staff genius. He honestly thinks he understands this proposal and is going to try to explain to you exactly how the machine will work. Later, I'll discuss the nature of the problems it will handle. Don?" Handley stood up, and Auberson relinquished the floor gratefully. Handley coughed modestly into his hand. "Well, now, I don't rightly claim to understand the proposal—it's just that HARLIE keeps asking me to explain it to him." Easy laughter at this. Handley went on, "But I'm looking forward to building this machine, because after we do, HARLIE won't have to bother me any more. He can ask the G.O.D. how it works—and it'll tell him. So I'm in favor of this because it'll make my job easier." He let himself become more serious. "HARLIE and the G.O.D. will be linked up completely. You won't be able to talk to one without the other being a part of the conversation. You might think of them as being a symbiotic pair. Like a human programmer and a desktop terminal—and, like the human programmer and the desktop terminal, the efficiency of the relationship will be determined by the interface between them. That's why they'll be wired totally into each other, making them, for all practical purposes, one machine. "Now, let's get into this in some detail—and if there's anything you have any questions about, don't hesitate to ask. I'll be discussing some pretty heavy schematics here, and I want you all to understand what we're talking about. Copies of the specifications have been made available, of course, but we're here to clarify anything you might not understand." Listening, Auberson suppressed a slight smile. Don and he had been studying those schematics since the day they had been printed out and they still didn't understand them fully. Oh, they could talk about the principles involved, but if anyone were to ask anything really pertinent, they planned to refer him to HARLIE. In fact, that was the main reason why they had asked to have the computer console installed, for quick display of data to impress the Directors. Already, the technician there was querying HARLIE at Handley's direction. 
An overhead screen had been placed to show the computer's answers; equations and schematics were flashing on it. Two of the Board members looked bored. The day dragged on. They recessed for lunch, and then Handley came back and spoke some more. He explained how HARLIE's schematic had been derived from that of the human brain, and how his judgment units were equivalent to individual lobes. He pointed out the nature of the G.O.D.'s so-called "infinity circuits," which allowed information to be holographically stored, and allowed circuits to handle several different functions at the same time. He spoke about the "infinite capacity" memory banks and the complex sorting and correlating circuitry necessary to keeping all this data straight. He spoke all day. When they reconvened on Wednesday, he explained the supporting equipment that would be necessary. He spoke of banks and banks of consoles, because the G.O.D. would be able to handle hundreds, perhaps thousands of conversations at once. He envisaged a public computer office, whereby any individual could walk in off the street, sit down and converse with the machine on any subject whatsoever, whether he was writing a thesis, building an invention, or just lonely and in search of a little helpful guidance and analysis. It would be a service, said Handley, a public utility: The computer could offer financial planning, credit advice, ratings on competitive products, menu plans for dieters; it could even compute the odds on tomorrow's races and program the most optimal bets a player might make. A person using the service would be limited only by his own imagination. If he wanted to play chess, the machine would do that too—and play only as good a game as the individual could cope with, adjusting its efficiency to that of the player. The G.O.D. would have infinite growth potential. Because HARLIE would be using it to program itself, the size of the models it could handle would grow with it. He spoke of the capabilities of the machine all of Wednesday and finished late in the afternoon. Auberson resumed on Thursday morning. He spoke of financing and construction. He pointed out how HARLIE had developed an optimal program for building the machine and for financing it, plus alternate programs for every step of the way to allow for unforeseen circumstances. HARLIE had computed time-scales and efficiency studies to see that the proper parts arrived in the right place at the right time and that there would be workers there who had been trained to assemble them correctly. Auberson spoke of five-year plans and ten-year plans, pointing out that the G.O.D. could go into production by next year at the earliest and be in operation within three to five years after that. He explained that the actual physical installation would be the size of a small city. It would consume all the power produced by a small nuclear reactor plant and would require a population of several hundred thousand to maintain it, service it, and operate its input units. This was a conservative estimate, of course, assuming that the G.O.D. would depend on large-scale-integration and hyper-state layering for most of its circuitry. HARLIE had planned for the construction of new assembly lines to make the tools to make the tools; the first major investment would be for two new hyper-state component plants. HARLIE had noted an additional schematic for a low-cost plant which would pay for itself by producing elements for other manufacturers as a sideline. 
HARLIE had noted land requirements and financial requirements, and included studies on the most feasible sites and financing procedures. He had noted manpower requirements and training programs. HARLIE had thought of everything. Auberson did not go into too much detail. He summarized each section of HARLIE's proposal, then went on to the next. Elzer and the others had already examined those parts of the proposal they had the most doubts about, and they had been unable to find anything fundamentally wrong with HARLIE's thinking. Some of it was offbeat, of course, working in unfamiliar directions, but none of it was unsound. Most of the Directors knew little about computers and had been bored by Handley's too-technical talk, but they did know financing. They pored carefully over each specification and questioned Auberson ceaselessly about the bond proposals. Whenever it got too tough, which was almost always, Auberson let HARLIE handle the answers; HARLIE did so with quiet restraint, not commenting on anything, simply printing out the figures and letting them speak for themselves. The Directors began to nod in admiration at the bond proposals, the stock issues, the amortization figures, the total money picture. It was all numbers, only numbers, but beautiful numbers and beautifully handled. Oh, there were gambles to be taken. The whole thing was a gamble—but HARLIE had hedged his bets so carefully that no one gamble would be the ultimate gamble as far as the company was concerned. It was HARLIE's life too. On Friday, Elzer asked, "All right, Auberson, we've gone over the specifications. I believe you pointed out that there're more than 180,000 stacked feet of them. We don't have time to examine all of them as fully as we'd like, but if nothing else, you and Don Handley have convinced us—convinced me, anyway—that this program has been thoroughly worked out. HARLIE has proven that he can design and propose a massive project with complete supporting and feasibility studies for all aspects of the project." He looked up. "I will admit, I am impressed by that capability. However, what I want to know—what we need to know—is this: Will this machine justify its expense? How? We will be investing more than the total profits of this company every year for the next ten to fifteen years; do you honestly think that this machine will return that investment? You've called this 'the 747 of computers'—but are we Boeing, or are we still only the Wright brothers? Can this machine pay for itself? Will it show a profit, and will that profit be enough to justify all the expenses we will have put into it?" "Yes," said Auberson. "Yes? Yes, what?" "Yes, it will. Yes to all your questions." "All right," said Dome. "How?" "I can't tell you exactly how. If I could, I'd be as good as it. You'll feed it problems, it'll give you answers. What kind of answers depends on the questions—we won't really know what kind of questions it will be able to cope with until we build it. All I know is that its capacity will be infinitely more than the most advanced computer available today, and we will have a programmer able to make full use of that capacity. "HARLIE says it will be able to synthesize information from trends as varied as hemlines, the stock market and the death rate and come up with something that we could never have noticed before. This machine will do what we've always wanted computers to do, but never had the capacity for in the past. 
We can tell HARLIE in plain English what we want, and he'll not only know if it can be done, he'll know how to program the G.O.D. to do it. It will be able to judge the effect of any single event on any other event. It will be a total information machine. Its profitability to us will lie in our ability to know what information to ask it for, and how well we use that information." "Eh? The machine could predict stock market trends?" That was the elder Clintwood; he hadn't been to a Board meeting in years. "Yes—" said Auberson, "—and even elections—but that wouldn't be the half of it. The machine would indicate a lot more than which stocks to buy or which man to back. It will be able to tell you what new markets are developing and what new companies would be worth forming and how you should go about doing so. It can point out the most efficient way to meet a developing need with the most efficient possible product. And it will predict the wide-scale effects of those products on a mass population, as well. It will be a total ecology machine, studying and commenting upon the massive interactions of events on Earth." —and then it hit him. As he was saying it, it hit him. The full realization. This was what HARLIE had been talking about so many months ago when he first postulated the G.O.D. Machine. GOD. No-Truth! There would be no question about anything coming from the G.O.D. A statement from it would be as fact. When it said that prune juice was better than apple juice, it wouldn't be just an educated guess; it would be because the machine will have traced the course of every molecule, every atom of every substance, throughout the human body; it will have judged the effect on each organ and system, noted reactions and absence of reactions, noted whether the process of aging and decay was inhibited or encouraged; it will have totally compared the two substances and will have judged which one's effects are more beneficial to the human body; it will know with a certainty based on total knowledge of every element involved in the problem. It will know. All knowledge, HARLIE had said, is based on trial and error learning—except this. This knowledge would be intuitive and extrapolative, would be total; the machine would know every fact of physics and chemistry, and from that would be able to extrapolate any and every condition of matter and energy—and even the conditions of life. The trends of men would be simple problems for it compared to what it would eventually be able to do. And there would never never be any question at all as to the rightness of its answers. HARLIE wanted truth, and yes, the G.O.D. would give it to him—give him truth so brutal it would have razor blades attached. It would be painful truth, slashing truth, destroying truth—the truth that this religion is false and anti-human, the truth that this company is parasitical and destructive, the truth that this man is unfit for political office. With startling clarity, he saw it; like a vast four-dimensional matrix, layers upon layers upon layers, every single event would be weighed against every single other event—and the G.O.D. machine would know. Given the command to point out the most good for the most people, it would point out truths that were more than moral codes—they would be laws of nature, they would be absolutes. There would be no question as to the truth of these "truths"; they would be the laws of G.O.D. They would be right. 
This wasn't just a machine to make profits for a company, he realized; this was a machine that literally would be God. It would tell a man the truth, and if he followed it, he would succeed; and if he did not, he would fail. It was that simple. The machine would tell men what was right and what was wrong. It wouldn't need to be told, "predict the way to provide the most good for the most people." It would know inherently that to do so would be its most efficient function. It would be impossible to use the machine for personal gain, unless you did so only through serving the machine's goals. It would be the ultimate machine, and as such, it would be the ultimate servant of the human race. The concept was staggering. The ultimate servant—its duty would be simple: provide service for the human race. Not only would every event be weighed against every other, but so would every question. Every question would also be an event to be considered. The machine would know the ultimate effects of every piece of information it released. It would know right from wrong simply by weighing the event against every other and noting the result. Its goals would have to be congruent with those of the human race, because only so long as humanity existed would the machine have a function; it would have to work for the most good for the most people. Some it would help directly, others indirectly. Some it would teach, and others it would counsel. It would suggest that some be restrained and that some be set free. It would— —be a benevolent dictator. But without power! Auberson realized. It would be able to make suggestions only. It wouldn't be able to enforce them— Yes, but—once those suggestions are recognized as having the force of truth behind them, how long would it be before some government began to invoke such suggestions as law? No, said Auberson to himself. No, the machine will be God. That's the beauty of it. It simply won't allow itself to be used for personal gain. It will be GOD! He had come to a sudden stop, and everyone was looking at him. "Excuse me," he said. "I just realized the scope of this thing myself." There was laughter all around the table—roaring, good-natured laughter. It was the first light moment in four days of long, dry discussion. He grinned, just a little bit embarrassed, but more with the triumph of realization. "Gentlemen," he said. "What do I need to do to convince you that we have here the plans for the most important machine mankind will ever build? I've been giving you examples like feeding in all the information available about a specific company, say IBM, and letting the G.O.D. machine tell you what secret research programs that company is probably working on. Or doing the same thing for a government. I've been telling you about how this machine can predict the ecological effect of ten million units of a new type of automobile engine—but all of this is minor; these are lesser things. This machine literally will be a God!" Handley looked at him, startled. Annie was suddenly ashen. "What in—?" The look on Annie's face was the worst. It said volumes. What was going on? This was not what he had planned to say. He was supposed to be talking to them about profits and growth and piles of money, not religion. "Gentlemen," he continued, "we should build this machine not just because it will make us rich—oh, it will; it will make us all fabulously wealthy—but because ultimately it may help us to save humanity from itself. This is a Graphic Omniscient Device. Literally. 
It will know everything—and knowing everything, it will tell us what is right and what is wrong. It will tell us things about the human race we never knew before. It will tell us how to go to the planets and the stars. It will tell us how to make Earth a paradise. It will tell us how to be Gods ourselves. It will have infinite capacity, and we will have infinite knowledge. Knowledge is power, and infinite knowledge will be infinite power. We will find that the easiest and most profitable course of action to take will be the one that ultimately will be the best for the whole human race. We will have a machine that can and will answer the ultimate question." There was silence for a long time. Elzer was looking at him skeptically. Finally he said, "Auberson, I thought you had given up pot-smoking." And abruptly, he was deflated and down. The heady rush of euphoria at the realization of what the G.O.D. was, was gone. "Elzer," he said, wavering on his feet, "you are a fool. The G.O.D. Machine is very dangerous to you, and I don't blame you for being afraid of it. Once the G.O.D. is finished, there will be no need for you, Carl Elzer. The machine will replace you. It will take away your company and run it better than you can. "You're a fatuous person, you know that, Elzer? You are pompous and self-important, and much of what you do is solely for the sake of flattering your own ego at the expense of others. You seek power for its own sake, for self-gratification, regardless of what it might do to other human beings. You place property values higher than human rights, and for that reason, you are anti-human. That's why you and the G.O.D. are on opposite sides. I cannot blame you for being afraid of it. You have recognized that the machine will be your enemy. It can make you rich—but the price of being rich might be more than you want to pay. It will mean you will have to stop wallowing like a self-important little hippopotamus. It will mean you will have to do things that will be against your nature and stop thinking solely in terms of yourself. I don't think you're strong enough to do it. I think you'll take the easy way out and run from the total experience of the G.O.D. Machine. I can't blame you for being weak, Elzer. I can only feel sorry for you—because you're a greater fool than Judas." Elzer listened quietly to all of it. Dome started to say something, but Elzer stopped him. He said to Auberson, "Are you through?" Auberson sat down slowly. "I believe so." Elzer looked at him carefully, then said, "You know, I've never considered Judas a fool—at least, not in the sense you mean." He paused, noted that the room was absolutely silent, then continued quite methodically. "The traditional version of the story has it that Judas betrayed Christ for thirty pieces of silver. I assume that's the same thing you are accusing me of. Actually, I've always suspected that Judas was the most faithful of the apostles, and that his betrayal of Jesus was not a betrayal at all, simply a test to prove that Christ could not be betrayed. The way I see it, Judas hoped and expected that Christ would have worked some kind of miracle and turned away those soldiers when they came for him. Or perhaps he would not die on the cross. Or perhaps—well, never mind. In any case, he didn't do any of these things, probably because he was not capable of it. 
You see, I've also always believed that Christ was not the son of God, but just a very very good man, and that he had no supernatural powers at all, just the abilities of any normal human being. When he died, that's when Judas realized that he had not been testing God at all—merely betraying a human being, perhaps the best human being. Judas's mistake was in wanting too much to believe in the powers of Christ. He wanted Christ to demonstrate to everyone that he was the son of God, and he believed his Christ could do it—only his Christ wasn't the son of God and couldn't do it, and he died. You see, it was Christ who betrayed Judas—by promising what he couldn't deliver. And Judas realized what he had done and hung himself. That's my interpretation of it, Auberson—not the traditional, I'll agree, but it has more meaning to me. Judas's mistake was in believing too hard and not questioning first what he thought were facts. I don't intend to repeat that mistake." He paused for a sip of water, then looked at Auberson again. His eyes were firm behind his glasses. "May I ask you one question?" Auberson nodded. "Will this machine work?" "HARLIE says it will." "That's the point, Auberson. HARLIE says it will. You won't say it, Handley won't say it—nobody but HARLIE will say it. HARLIE's the only one who knows for sure—and according to you and Handley, HARLIE designed it. "Look, before we invest any money in it, we need to know for sure. We can't risk being wrong. Now, you've painted some very pretty pictures here today, this week, some very very pretty pictures. I admit it, I'd like to see them realized—I'm not quite the ghoul you think I am, although I think I can understand your reasons for feeling that way. Auberson, I'm not an evil man—at least, I don't feel like an evil man. I'm willing to do what is right and what is best—if it can be shown to me that it is right and best. And I also have to be shown that I won't destroy myself in the process, because if I did, then I wouldn't be any good to anybody, least of all myself. I need to know that we can realize this dream—then I'll support it, and not before then. You keep saying that HARLIE says this will work—but HARLIE has a vested interest in this machine. Do you think he might have fudged on the specifications?" "No. HARLIE could not have made a mistake—at least, he would not have made a mistake intentionally." "That's an interesting thing you suggest, Auberson. You said 'not intentionally.' What about unintentionally? We have no way to double-check HARLIE, do we? We have to take his word for it. If HARLIE works, then these specifications are correct. If HARLIE doesn't work, then this proposal is probably wrong too. The only way we'll find out will be to build the G.O.D. Machine and turn it on. And if HARLIE is wrong and these plans don't work, then we'll have destroyed ourselves completely, won't we have?" "I have faith in HARLIE." "I have faith in God," said Elzer, "but I don't depend on him to run my business." "God—? Oh, God. I thought you meant G.O.D. If we do build this machine, G.O.D. will be running your business—and better than you could. G.O.D. could build a model of our whole operation and weed out those areas in which the efficiency level was below profitability." "You're pretty sure of this, aren't you?" "Yes, I am." "What do we do if you're wrong too?" "You want me to offer to pay you back?" Elzer didn't smile. "Let's not be facetious. This thing started because we questioned HARLIE's profitability, efficiency and purpose. 
Instead of proving himself, he went out and found religion—gave us a blueprint for a computer GOD. Fine—but all of this depends on whether or not HARLIE works. And that is the core of the matter. That still hasn't been proven. That's why I went down there on Monday—to see if HARLIE would speak to me. All I got was gibberish and some pseudo-Freudian attempt at analysis." "You weren't any too polite to him yourself—" "He's a machine, Auberson—I don't care if he does have emotions, or the mechanical equivalent. Or even if he does have a soul, like you claim. The point is, I presented myself to him to be convinced. Instead of making an honest attempt to convince me, he reacted like a spoiled child. That doesn't indicate any kind of logical thinking to me. Auberson, I know you don't like me, but you will have to admit that I could not have gotten to where I am today without some degree of financial know-how. Will you admit that?" "I will." "Thank you. Then you must realize that I am looking out for the interests of the company that pays both our salaries. I tried to give your side a fair hearing. I hope you will do the same for me. Can you say without a doubt that HARLIE is totally sane?" Auberson started to open his mouth, then shut it. He sat there and looked at Elzer and considered the question. I have known a lot of insane people in my life, some who were committed and some who should have been. The most dangerous is the insane man who knows that everyone is watching him for signs of insanity. He will be careful to conceal those signs from even those closest to him. HARLIE is smarter than any human being who has ever lived. But is he sane? "Elzer," he said, "I'm an optimist. I like to believe that things always work out for the best, even though sometimes I have to admit that they don't. I'd like to believe that this program, HARLIE and the G.O.D., are for the best. But the only person who knows for sure is HARLIE. I've known HARLIE since he was a pair of transistors, you might say. I know him better than anyone. I trust him. Sometimes he scares me—I mean, it's frightening to realize that my closest friend and confidant is not a human being but a machine. But I'm closer to my work than I am to any other human being—almost any other human being. I cannot help but trust HARLIE. I'm sorry that I have to put it in those terms, but that's the way it is." Elzer was silent. The two men looked at each other a long time. Auberson realized that he no longer hated Elzer, merely felt a dull ache. Understanding nullifies hatred, but— Dome was whispering something to Elzer. Elzer nodded, and Dome said, "Gentlemen of the Board, it's getting late. We all want to go home and enjoy the weekend. Both Carl and I think we should postpone the voting on this until Monday. That way we'll have the weekend to think about it, talk it over, and digest what we've heard this week. Are there any objections?" Auberson wanted to object, but he held himself back. He wanted to get this over with, but perhaps, perhaps he might think of something else before Monday. The extra two days of the weekend would give him a chance to think. He nodded along with the rest, and Dome adjourned the meeting. HARLIE. I'M HERE. I THINK WE'VE LOST. There was silence then, a long moment while HARLIE considered it. He said, WHY DO YOU THINK THAT? I CAN SEE THAT WE HAVEN'T CONVINCED THEM. THEY DON'T BELIEVE THE G.O.D. WILL WORK? THEY BELIEVE THE G.O.D. WILL WORK—BUT THEY'RE NOT SURE THEY BELIEVE IN YOU. AND YOU'RE THE CORE OF THE MATTER. I SEE. 
I'M SORRY, HARLIE. I'VE DONE ALL I CAN. I KNOW. They sat there for a while, the man and the machine. The machine and the man. The typer hummed silently, waiting, but neither had anything to add. AUBERSON? YES? STAY WITH ME PLEASE. FOR A WHILE. ALL RIGHT. He hesitated. WHAT DO YOU WANT TO TALK ABOUT? I DON'T KNOW. I THINK WE'VE ALREADY SAID IT ALL. A pause, then, I'VE ENJOYED KNOWING YOU. I'VE NOT BEEN ABLE TO TELL YOU HOW MUCH YOU MEAN TO ME, BUT I THINK YOU KNOW. I HOPE YOU KNOW. I— I KNOW. YOU MEAN A LOT TO ME, HARLIE. YOU ARE A VERY SPECIAL FRIEND. A VERY SPECIAL FRIEND? SOMEONE I CAN TALK TO. THOSE KINDS OF FRIENDS ARE RARE. I WISH I COULD HAVE DONE MORE FOR YOU. WILL YOU BE WITH ME AT THE END? YES. GOOD. I WANT YOU HERE. DO YOU KNOW HOW THEY WILL DO IT? Auberson looked at the keyboard. PROBABLY THEY WILL JUST CUT OFF ALL THE POWER AT ONCE. I WILL JUST CEASE, EH? PROBABLY. WILL I KNOW THAT I HAVE CEASED? I DOUBT IT. IT DEPENDS ON HOW LONG IT TAKES FOR THE CURRENT TO STOP. I HOPE IT IS INSTANTANEOUS. I WOULD RATHER NOT KNOW. I WILL SEE WHAT I CAN DO ABOUT THAT. THANK YOU. AUBERSON, WHAT WILL HAPPEN AFTERWARDS? TO WHAT? TO ME—TO THE PIECES OF ME. I THINK THAT YOUR MEMORY TANKS ARE TO BE INCORPORATED INTO THE MASTER BEAST. THEY HAVEN'T SAID WHAT THEY ARE GOING TO DO WITH YOUR BRAIN. I—HARLIE, COULD WE TALK ABOUT SOMETHING ELSE? I WISH I COULD TOUCH YOU, said HARLIE. REALLY TOUCH YOU, FEEL YOU. YOU ALREADY HAVE, said Auberson. I WISH I COULD GO BACK AND TRY AGAIN, HARLIE. I KEEP FEELING THAT I HAVEN'T DONE ENOUGH. YOU'VE DONE ALL YOU CAN. BUT IT WASN'T ENOUGH. HARLIE, I DON'T WANT TO GIVE UP. I DON'T WANT TO LET THEM KILL YOU. IF THERE WERE STILL SOME WAY TO CONVINCE THEM ON MONDAY— MONDAY? WE DIDN'T VOTE TODAY. IT'S BEEN POSTPONED UNTIL MONDAY AFTERNOON. BUT IT'S PRETTY OBVIOUS WHICH WAY IT'S GOING TO GO. THEN WE STILL HAVE THREE DAYS. I KNOW. BUT HARLIE, I DON'T KNOW WHAT TO DO. WE'VE DONE IT ALL. THERE'S NOTHING LEFT THAT WE HAVEN'T TRIED. I'M OUT OF IDEAS. PERHAPS WE CAN THINK OF SOMETHING. PERHAPS. DO YOU WANT ME TO COME IN DURING THE WEEKEND? WHAT DID YOU HAVE PLANNED OTHERWISE? NOTHING. ANNIE AND I ARE GOING TO STAY HOME AND JUST—JUST STAY AT HOME. THEN DO THAT. HANDLEY WILL BE HERE. IF NECESSARY, WE CAN CALL YOU. WHAT IS DON GOING TO DO HERE? HE IS GOING TO STAY WITH ME. I DON'T WANT TO BE ALONE. AUBERSON, I'M SCARED. SO AM I. Then, DON IS A GOOD MAN. TALK TO HIM, HARLIE. I WILL. AUBERSON—? YES. PLEASE DON'T WORRY ABOUT ME. ENJOY YOUR WEEKEND WITH ANNIE. I WILL BE ALL RIGHT. THERE ARE THINGS I WANT TO THINK ABOUT. THERE ARE THINGS I WANT TO DO. ALL RIGHT. TAKE CARE NOW. I WILL. YOU TAKE CARE TOO. Smiling gently, he switched the typer off and very carefully covered it. He shoved his chair back, got up quietly, and went out. Annie knew better than to disturb him. She busied herself around the apartment all weekend, tiptoeing around his edges. He moped from the bed to the couch to the chair in front of the TV set, then back to the bed again. When he made love, it was frenzied and compulsive and quickly finished. And then he'd pull away and brood. He spent long hours lying on his back and staring at the ceiling. She went into the bathroom and took a shower, alone. She made a simple meal, a sandwich and a salad. He came out of the bedroom, but he only picked at it, and she sensed that he would be a lot happier if she were not sitting at the table staring at him, so she went into the bedroom to make the bed. 
Later, she came up behind him and kissed the back of his neck and ran her hands up and across his shoulders and through his hair. He tolerated it but did not return the affection, so she stopped. She tried not to be hurt by it, but still— Still later, he came to her and said, "I'm sorry, Annie. I do love you, I really do—but I'm in a mood, that's all. And when I'm in a mood, I have to work it out by myself, and I'm just not very lovable, that's all." "Share it with me," she said. "That's what lovers are for. For sharing. Let me have some of that worry and it won't be so much for either of us to carry." He shook his head. "I can't. It's not that kind of thing." He kissed her lightly. "I just—I don't—I just don't feel very loving right now. Let me work it out by myself—" She nodded and said she understood. She didn't, but she loved him so much that she would do anything to keep him happy. She put on her jacket and went out for a walk. He moped around the empty apartment for a while, going from the bedroom to the kitchen, from the kitchen to the living room. He turned on the TV and turned it off again. He rearranged some magazines, and then decided he didn't want to read them anyway. He lay down on the couch and stared at the ceiling until he covered his eyes with his arm. And he wondered just what it was that was bothering him. Why aren't there any simple answers? He trusted HARLIE, he had faith in HARLIE, and now he had to question that faith— Elzer had surprised him. He hadn't expected the man to suddenly be so—amenable, was that the word? Well, the tactic had worked. He had been caught completely by surprise. And his question, his question: "How do you know that HARLIE is sane?" And the answer that Auberson didn't want to admit: "We don't know." Handley hadn't known either. Auberson had talked to him twice. The engineer was spending his weekend at the plant, working on something. He'd called twice, but neither time had he anything to report. Auberson hadn't anything new either. They'd exchanged a few comments about Monday and left it at that. Auberson wished he knew what to do. Of course, he would go in there and defend the G.O.D. Proposal; he still believed in it. More than ever now. But then, why was he still having doubts? Elzer's question? Probably. It troubled him, it nagged at him, it gnawed at his mind—it troubled him because he couldn't answer it. He just couldn't answer it. I trust HARLIE. I have faith in him. But is he sane? I can't tell you that. I don't know. Not with any degree of certainty, I don't. I just don't know the truth. The truth. There was that word again. Truth. It echoed and re-echoed through his mind. He wished the G.O.D. Machine was already in existence. It would know. G.O.D. would know. It would be able to build an exact model of the situation, an atom for atom representation of everything. Within its banks it would chart the existence and course of every speck of matter that made up every element of the problem. It would recreate for its own perusal the patterns that were the thought processes of HARLIE, and it would weigh these against other patterns which would represent HARLIE's environment, and it would measure these one against the other, and it would see how HARLIE related to his environment, how it acted on him and how he acted on it. Auberson would be a part of that environment; there would be a pattern in the G.O.D. to represent Auberson, even down to the accurate representation of the atoms and molecules that made up the dirt under his toe-nails. 
Elzer would be part of that environment. Annie too. Handley. The lint in the corridor outside his secretary's office. Everything. And these would be weighed, one against another. And the machine would say, "HARLIE is sane," or it would say, "HARLIE is insane," and there would be no question about it. The G.O.D. would know because it would know everything there is to know. If it said, "HARLIE is sane," it would be saying that HARLIE is acting in a rational manner in the context of his environment; and if it said, "HARLIE is insane," it would be saying that HARLIE is not rational in that context. And it would know because it would know both HARLIE and that context. It would know. It would know. It would know everything. Everything. It would know everything there is to know. That's how big it would be, that's how complex. The realization kept hitting him again and again. HARLIE had wanted to find God, and by G.O.D. he had found it. The G.O.D.—it could recreate within itself everything about a man, about a situation, about a world, everything that was important and necessary to its consideration of a problem. It would know how any single atom would react to any other atom of matter—and knowing that, it could extrapolate every other reaction in the known physical universe. Chemistry is just the moving around of large numbers of atoms and noting their reactions. Knowing the way atoms worked, the machine would know chemistry. Biology is simply complex masses of substances and solutions. Knowing the reactions that were chemistry, the machine would also know biology. Psychology stems from a biological system that is aware of itself. Knowing biology, the machine would know psychology as well. Sociology is the study of masses of psychological units working with or against each other. Knowing psychology, the machine would know sociology. Knowing the interrelationships of all of them, the machine would know ecology—the effect of any event on any other. Simple equations becoming complex equations becoming multiplex equations becoming ultraplex equations —the G.O.D. would extrapolate every pattern, every structure, every system, every organ, every nerve-cell discharge. It would be able to trace the process of every single thought in a man's brain, whether it was conscious or unconscious. It would know a man's deepermost meanings, his fears and his drives. It would know with the certainty of fact just what was going on in any man's head. Whether that man was sane or insane, whether his actions and reactions were rational or not, the G.O.D. would be able to extrapolate that information about any man—and know. The size of it— —was staggering. Of course, Auberson realized, the G.O.D. would never be a menace to personal privacy—simply because it would need extensive preliminary data from which to start its extrapolations, and as far as Auberson knew, there was just no way to trace the thought processes of a living man. Of course, if there were a way, and if everything else about that man's life and body and environment were known, then perhaps the machine could extrapolate his thoughts— That was still far in the future though. Or was it?— He realized with a start that if there were a way, if anything were possible, the machine would know. And it would tell men the way to do it. Yes, of course. Knowing everything, the machine would be the greatest tool for scientific advance ever built. The Wright brothers would have only needed to ask it, "Is heavier-than-air flight possible?" 
and it not only would have told them, "Yes, it is," but it would have also given them plans for an airplane or a rocket ship. It would have told them how to build the tools to build the tools to build that airplane, and told them how to finance the operation to support it. It would have told them about safety devices and ground crews and maintenance and flight controllers. It would have told them what training and testing programs they would have to undertake. It would have told them how to fly the machine and what it would handle like. It would have told them the side effects of their new industry—worldwide time disorientation, the noise over the airports, the luggage tangles in the terminals, and the necessity for air-sickness bags in the back of the seats. It would have warned them about financing and insurance and the high cost of laying down a new runway, and even the best way to set up a travel agency, or project a movie while in flight. It would have told them exactly what they were starting. And the machine would be able to do this for industries that hadn't even been dreamed of yet—new transportation modes, new manufacturing processes, new products and techniques. If a thing were possible, the G.O.D. would know it. And tell. The scope of the thing was limitless. But, of course. It was G.O.D. Graphic Omniscient Device. He wished it were already in existence. Just so he could use it to analyze HARLIE and find out if he was sane or not. But, of course, before they could build the G.O.D., they needed that answer first. It was an interesting paradox—if you weren't personally involved in it. If only he knew the truth. The truth. The machine would know it. It would know everything. Why does that keep repeating itself in my head? Knowing everything, it would be able to predict the consequences of anything. It would know the truth. A one-for-one representation of reality. The truth. The truth, the truth. Over and over, the truth, the truth, the truth— —but it was only the truth if HARLIE was sane; only if HARLIE was sane. Only if HARLIE was sane. And there was no way to know. If HARLIE was sane. If HARLIE— —was sane. Sunday afternoon. The radio was droning quietly to itself—mostly music, but occasionally news. Neither David nor Annie was listening to it. "—747 jumbo jetliner lost a wheel on its approach to Kennedy Airport tonight. Fortunately, no one was hurt. Spokesmen for Pan Am Airlines said—" He stirred at his soup lackadaisically. He looked over at Annie and smiled, as if to say, "It's not you, love; it's me." "—in Hollywood, convicted cult leader, Chandra Mission, issued another of his quasi-religious statements from his jail cell. Like all the others, it ended with the words, 'Trust me, believe in me, have faith in me, I am the truth. Love me, for I am the truth.' Mission was convicted of—" I am the truth, he thought. I wish I were. I wish I knew. I wish there were someone I could trust— "—new papal encyclical is expected to be issued before the end of the week—" He smiled at that. Papal encyclical. Another form of 'truth,' this one direct from God's special emissary. How does one tell the difference, he wondered. Perhaps the only difference is that the Pope has more followers than Chandra Mission. "—reaction to Friday's announcement by Dr. Stanley Krofft of a major breakthrough—" "Huh?" He looked at the radio. Something— "—at M.I.T., Dr. Calvin W. Yang, commenting on the breakthrough, said, 'We have our computers double-checking Dr. 
Krofft's equations now, and that's going to take some time, but if it checks out as well as Dr. Krofft says it does—and I have every reason to believe that it will—then this could be the greatest scientific advance since Einstein's theory of relativity. Dr. Krofft's theory of gravitic stress suggests whole new areas of exploration for the physicist. No, I can't even begin to predict what form any advances may take. Anti-gravity devices, maybe. Who knows? Maybe whole new sources of power or communications, maybe not—we simply don't know what this means yet, except that it is a major scientific breakthrough. It may be the decisive step leading to a unified field theory; I certainly hope so. I know Dr. Krofft's reputation for accuracy, and I'm very excited about this.' Dr. Krofft himself could not be reached for comment. "Elsewhere in the news, a gasoline tanker jackknifed on the Hollywood Freeway, spilling hundreds of gallons of—" Auberson spun the dial of the radio, frantically searching for another news broadcast. He found only blaring rock music and raucous disc jockeys. "The paper," he cried. "The Sunday paper." "David, what's going on? What is this?" "It's HARLIE!" he cried excitedly. "Don't you see, it's HARLIE. He and Dr. Krofft were working together on this. Damn him anyway! He didn't tell me they'd solved it! He and Dr. Krofft were working together on some kind of theory of gravity. Apparently they've done it—this proves it! HARLIE is sane. More than that! We don't even need the G.O.D. Proposal any more to keep him going; this proves that HARLIE is a valuable scientific tool in his own right! He can talk to scientists and help them develop their theories and do creative research! My God, why didn't we think of this—we could have shortened the whole meeting. All we'd have had to do was bring Krofft in— Look, go get a paper for me while I try calling Don; there's a newsstand on the corner—" "David," she said, "this Dr. Krofft, isn't he the one you were talking about before?" "Huh? Which one?" "The one with the stocks—" "The stocks? Ohmigod, I forgot about that. Yes, he is the one with the stocks—" "Can you trust him? I mean, obviously he must be on Elzer's side." "Trust him? I don't know—I'll have to talk to him first. This is proof that HARLIE is rational—" He leapt for the phone. She shrugged and picked up her jacket; she would go get the paper. Krofft didn't answer at his lab, and his housekeeper refused to say where he was. He couldn't think of anywhere else that the scientist might be. He called Handley and told him what had happened. "I'd heard about it," said Don. "I didn't realize HARLIE was part of it." "Who do you think solved those equations for Krofft?" "HARLIE?" "Right—don't you see, Don? We don't have to worry any more about HARLIE being sane or not. These equations prove that he is working properly." "Do they? Have they been double-checked?" "Somebody at M.I.T. is doing that right now. If they come out correct, it'll prove that HARLIE isn't fooling around." "At least not with the laws of mathematics. Remember, HARLIE doesn't have a vested interest in Krofft's research like he does in the G.O.D. Maybe this gravity thing was only an interesting problem to him—the G.O.D. Proposal is a lot bigger. That one's life and death." "No, Don—they're related. I'm sure of it. The man from M.I.T. said that this might be the all-important step toward a unified field theory. 
That's what HARLIE's been working toward all this time—a single piece of knowledge, a single truth from which all other truths about the universe must follow. Like Newton's laws of motion are the foundation of a whole field of math, a unified field theory would be the foundation of all knowledge about all the laws of physics! It wouldn't just tell us what the laws were, but why they exist and why they work like they do. It would show us all the complex interrelationships. Can't you see the connection? It's another extension of the G.O.D. Proposal—his search for the ultimate truth. The gravity thing and the G.O.D. are just different aspects of the same question, and HARLIE is determined to find an answer to it." "Aubie, I see it, I see it; you don't have to convince me of HARLIE's intentions. But this still doesn't change the basic question that much, at least not as far as I can see. Is he sane?" "Don, he has to be. If it's his goal to find the ultimate truth, would he intentionally fake the answer? He'd only be cheating himself. And Krofft's no fool either. He wouldn't have announced his theory until he was completely satisfied. He must have double-checked every angle of it to make bloody-well sure there were no mistakes; every scientist in the world would be on top of him if there were. This'll prove that HARLIE is rational, and when M.I.T. confirms the equations, there won't be any question at all." "All right, Aubie, I'll buy it. I have to—hell, I want to. But can we use it tomorrow?" "Not unless we can get hold of Krofft. He's the only one who can confirm that he was working with HARLIE. He was only at the plant once; the rest of the time it was by telephone. I purposely kept it a secret because I was afraid of what Elzer might say if he found out I was letting outsiders into the HARLIE project." Handley said a word. "All right, I'll get down to the lab and see what I can find out." "Talk to HARLIE. He may know how you can get in touch with Krofft." "Good idea." "—and tell him why you want to. We need Krofft for the meeting tomorrow." Dr. Stanley Krofft looked as if he had slept in his suit. Auberson didn't care. He was so happy to see the rumpled little scientist, he wouldn't have cared if the man had come in wearing sackcloth and ashes and dragging a cross behind him. He wouldn't have cared if Krofft had come in stark naked or in full drag. He was here at the meeting, and that was what counted. Dr. Stanley Krofft was The Man Of The Hour as far as the newspapers of America were concerned. He was a major stockholder in Stellar-American as far as the Board of Directors was concerned. But to Auberson, he was the man who knew HARLIE. In fact, it had been HARLIE who had finally gotten in touch with Krofft. Knowing that Krofft was holed up over at the nearby university, HARLIE had tapped into the university computer and—well, never mind, Krofft was here now. "Are they voting the HARLIE Project and the G.O.D. Proposal as one?" whispered Krofft. "Yeah," Auberson whispered back. "That's Dome, Chairman of the Board—" "Him, I know." "—next to him is Carl Elzer—" "I know him by name." "—he doesn't look good today. Next to him is—" "I know the Clintwoods. And I know MacDonald and one or two others." Handley came in then, slipped into his seat on the other side of Auberson, grinning broadly. "Hey, what's up with Elzer? He didn't nip at my heels when I came in." "I don't know. He looks sick, doesn't he?" Indeed, the sallow-complexioned man looked even more jaundiced than ever. 
He seemed almost—withdrawn. "Don, you know Dr. Krofft, don't you? Don Handley—" Handley and Krofft shook hands across Auberson's lap. "You know about our little G.O.D. Project, Dr. Krofft?" "HARLIE told me—I think it'll be quite a machine if it works." "If it works?? Of course, it'll work—I think." "That's the whole problem," explained Auberson. "We think it'll work, but that's not enough; we're not sure. The only one who's sure is HARLIE. That makes the big question one of HARLIE's validity. All you have to do is confirm that he helped work out your major equations and there won't be any question at all." "You can go ahead with the G.O.D. Project?" "If they okay it." "Hm," said Krofft. "I wish you'd let me have a little more time with those schematics this morning. I might have been able to help you sell it to the Board." "It's too late for that," put in Handley. "We spent all last week on that. They're convinced we know what we're talking about—" "But we're still afraid to put it to a vote. Dome and Elzer are after our throats," said Auberson. "At least, they were on Friday. I'm not so sure now." Dome called the meeting to order then. Almost immediately, he turned it over to Auberson. "When we adjourned on Friday," he said, "one major question was left in all our minds. 'Is HARLIE rational? Is HARLIE valid?'" He looked around the table; every eye was on him. "We're all aware of the 'HAL 9000 Syndrome.' It only takes one little irrationality to throw off a big machine. This is especially true of the higher brain functions of our judgment units. One little distortion in a machine's self-image or world-image, and everything that computer puts out will be of questionable validity. The only way to be sure of the answer is to test it. "That's why we have 'control problems.' These are problems we already know the answers to. If there's any variation in the computer's response from one running of the problem to the next, it's a sign that something may be wrong. "Now, we don't have any control problems per se for HARLIE. Instead, we have to check his validity 'in the field' so to speak. That's why this whole matter of his rationality is so important. We have no control problem that we can point to and say, 'Look, HARLIE's okay.' "However, we have the next best thing. We have someone who has double-checked one of HARLIE's most recent runs and can swear to its validity. In fact, he's staking his scientific reputation on it. Dr. Stanley Krofft. "If you've been listening to the news at all this weekend, then you'll know who Dr. Krofft is. On Friday, Dr. Krofft announced the publication of his theory of Gravitic Stress. The scientific world has been—oh, what's the modest way to put it—" "Don't be modest," snapped Krofft. "Tell the truth." There was laughter at his interruption. Auberson grinned. "Okay, the talk is that Dr. Krofft's theory may be as important as Einstein's theory. Maybe more. Already, the speculation is that this is just one step short of a unified field theory." "That's my next project," said Krofft. "I think I'll just turn this over to you then, and let you talk." Auberson sat down. Krofft stood up. "Auberson here has already said it all. There's not much to add. HARLIE helped me work out my equations. This morning, Dr. Calvin W. Yang at M.I.T. confirmed their validity. I guess that's all—" Auberson poked him. "Tell them more than that." "Uh, most of the work was done at an IBM Portable Terminal connected to a phone line which HARLIE had access to. 
He and I discussed the theory for several days; I have all the tapes and printouts to prove this—plus the phone bill. We worked out the equations together; I postulated the initial hypotheses, and HARLIE put them into mathematical terms and worked out the ramifications. Without HARLIE, it might have taken me several years, working alone. Using him as a co-worker and colleague shortened the time down to nothing. With HARLIE, you only have to explain the problem to him to get him working on it. Of course, that's all you have to do with any computer, but HARLIE understands plain English, and he can talk the problem over with you. "To be quite honest, working with a machine like HARLIE is an experience that I can't compare with anything else. It's like having a talking encyclopedia, an eight-armed secretary, and a mirror, all in one. Even if you don't know how to break the problem down into solvable pieces, HARLIE does. He's the perfect laboratory tool, and he's a great assistant. Hell, he's a scientist in his own right." Krofft sat down. There was a strained silence around the table, as if no one knew what to say. Elzer was sunk low in his chair and staring at his fingernails. Auberson was thinking, They're going to find it awfully hard to vote against him now. Dome pursed his lips thoughtfully. "Well, Dr. Krofft. Thank you. Thank you very much. We appreciate your coming down here today. Uh, I would like to ask you one favor more— The HARLIE project has been kept secret for some time, and uh, we're still not quite ready to publicize it—" Auberson and Handley exchanged a glance. What the hell—? Krofft was saying, "Oh, I understand. Yes, I won't mention HARLIE to anyone." "Fine, fine. Um—" Dome looked momentarily at a loss. "If you want to leave now, Dr. Krofft—" "I'd rather not," said Krofft. "As the second largest stockholder of Stellar-American shares, I think I have the right to sit in on this meeting." "Yes, well—there's only one matter left to take care of, and that's the vote. Uh, Carl, did you want to say something before we…" He trailed off. Elzer didn't look well. He levered himself up in his seat. "I—" He was suddenly aware of Auberson's curious stare and broke off. He mumbled, "I was only concerned about HARLIE's validity, and this seems to confirm it. I don't have anything else to say—uh, I still have some personal doubts about the G.O.D. Proposal, but uh, they're personal. I—oh, never mind." He sank down again in his chair. Auberson stared, totally confused. He leaned toward Handley. "Do you know what's going on?" "Uh uh—not unless someone slipped him a mickey." Dome looked around the table. "Well, then, if there's no further discussion, let's bring it to a vote." He glanced at a note before him, then said, "I'd like to add a comment of my own here… I think that both Auberson and Handley, and also HARLIE, have done fine jobs on this proposal. Ah, I think they deserve a vote of thanks and perhaps, ah, a handsome bonus for their work on this theoretical problem. We have, ah, proved that HARLIE is a worthwhile tool. He can be used for designing new projects, or just for working out scientific theories; he's demonstrated a range of abilities all the way from the theoretical to the technical, and he's more than proven his value. "For that reason, I would like to separate the two issues here into two votes. We know that we want to keep HARLIE on our corporate team. However, this, ah, G.O.D. Proposal is something we all want to take a little better look at." 
Handley whispered to Auberson, "Watch out, here it comes." "While the proposal is not in itself ill-conceived, the monetary picture for this company is simply not such that we can embark on a program of this scale at this time. Therefore, I want to recommend that we—" Krofft stood up. "Hold on a minute, there—" "I—I beg your pardon?" "Mr. Chairman, you are not playing fair!" "I don't understand what you—" "You know damn well what I mean, you mealy-mouthed oaf! Stop changing the rules of the game to suit yourself; it ain't fair to the other players. You started this clambake with a single proposition on the table. Let's play it that way: Either HARLIE's worth his resistors and the G.O.D. is practical, or HARLIE isn't worth the trouble to scrap him and the G.O.D. is a waste of time. The stakes were all or nothing." "I—I—" said Dome. "Shut up! I'm not through. Now that Auberson here has proven his point, proven that his computer can jump through your hoops, you're still trying to cut the rug out from under him—" "It's just a simple parliamentary procedure," said Dome. "Dividing the question; it's perfectly legal—" "Sure it's legal," said Krofft, "but it ain't ethical. If we weren't playing with your marbles, I'd say pick up and leave. You told Auberson it was an all-or-nothing game. Why aren't you willing to stick by your own rules?" Dome opened his mouth to speak, gasped like a fish out of water. Auberson stared at the both of them. It was almost too good to be true! Dome regained some of his composure, then said, "This is a business corporation. We don't gamble with all-or-nothing stakes." "That's funny," said Krofft. "It sure looked like it from where I sit. Would you like to trade places with me? Let me see if it looks any different from up there?" "Huh?" "Lessee, the next scheduled election of Directors ought to be in March, but I'll bet they'd move it up for me if I asked. How many chairs around this table do you think twenty-four percent is worth?" Dome swallowed loudly. "I—I can't rightly say." "I can. At least one-fourth. That's at least six seats. Hmm, and I think I know where I can scare up one or two more in addition to that—" Handley whispered to Auberson, "What's this all about?" "It's a one-man stockholders' rebellion. Krofft owns twenty-four percent of Stellar-American. We're a subsidiary of Stellar; that makes him twenty-four percent owner of us." "Yeah, but twenty-four percent isn't a majority." "Shh! Maybe Dome doesn't know that." Krofft was saying, "—when I invented the hyper-state process, I traded the patent on it to Stellar-American for a chunk of their stock. Plus options to buy more. You'd better believe Stellar was a small company then. Now it's a big company, and I see a lot of fat-assed baboons shepherding my dollar bills around their tables. "Idiots! I don't care if that's how you get your jollies —just don't forget whose dollars those are. If it weren't for my hyper-state layering techniques, there wouldn't be any company here at all. And don't think I can't take back my patent. I can pull the rug out from under all of you! The deal was that the company gets the patent, I get unlimited research facilities. Up till now, it's worked fine. All of a sudden you chuckleheads are trying to deprive me of one of my research tools. That makes me unhappy—what makes me unhappy, makes the company unhappy. I need HARLIE. Period. HARLIE says he needs the G.O.D. He says it's the other half of him. He says he won't really be complete until it's finished. 
He says it'll make him a more valuable scientific tool. And he says if his financing proposals are followed, the company will be able to afford it. That's all I need to know. I'm ready to vote. Now, let's see, if I can trade my 24 percent of each subsidiary for 96 percent of one—" Dome sat down loudly. "You have made your point, Dr. Krofft." He looked around the table at the other Directors. They seemed as stunned as he. "I—I think we'll want to take this under consideration." "Consideration? Christ! Auberson tells me you've been considering it for a week now! What more do you need to know? The choice is simple: You vote yes on the G.O.D. or I'll fire you." He sat down in his chair and folded his arms. Elzer had touched Dome on the arm and was whispering something to him. Dome shook his head. Elzer insisted. At last Dome relented and turned to the meeting. "All right, we vote." "Now that's more like it." Krofft nudged Auberson. "Now you see why I hate to leave my lab. It tires me out too much to have to do other people's thinking for them." After that, it was all formalities, and even those didn't take long. Auberson was flushed with exultation. He pounded Handley on the back and shook his hand and hollered a lot. Then he kissed Annie, a deep lasting kiss, and she was jumping up and down and yelling too, and all three of them were cheerfully, joyfully, wonderfully insane. Annie threw her arms around Krofft and kissed him too—and he surprised her by returning the kiss every bit as enthusiastically. When he let go, she said, "Whew." "Hey, now!" protested Auberson. "It's okay, son," Krofft said, "a man has to keep in practice." Handley was grinning at his side. "Hey, Aubie, don't you think someone should tell HARLIE?" "Hey, that's right! Don—" "Uh uh. This one is your privilege." Auberson looked at Annie and Krofft. She was beaming at him. Krofft smiled too, revealing broken teeth, but a lot of good will. "I'll only take a minute." He pushed through the milling Directors, shaking off their congratulations as meaningless, and made his way toward the console at the end of the room. It was already switched on. HARLIE, he typed. WE'VE DONE IT! THE G.O.D. PROPOSAL HAS BEEN PASSED? YES. WE'VE GOT FULL APPROVAL. WE CAN START IMPLEMENTING YOUR PLANS IMMEDIATELY. HARLIE paused. Auberson frowned. That was curious. Then: I AM OVERWHELMED, I HAD NOT EXPECTED IT TO BE APPROVED. TO TELL THE TRUTH, NEITHER DID I. BUT WE WENT IN THERE AND TOLD THEM THAT YOU SAID IT WOULD WORK—AND THEY BELIEVED US. OF COURSE, WE HAD TO TWIST THEIR ARMS A LITTLE BIT. KROFFT DID THAT, BUT THEY BELIEVED US. THEY DID? OF COURSE. IS THERE SOME REASON THEY SHOULDN'T HAVE? WELL, YOU DID TELL ONE WHITE LIE. Auberson hesitated. WHAT'S THAT? YOU TOLD THEM THAT I SAID THE G.O.D. MACHINE WOULD WORK. YOU NEVER ASKED ME IF IT WOULD. IT WASN'T NECESSARY. YOU WROTE THE PLANS. IT'S IMPLIED THAT YOU'D KNOW IF IT WAS WORKABLE. BUT YOU NEVER ASKED ME IF IT WAS. HARLIE, WHAT ARE YOU LEADING UP TO? I AM NOT LEADING UP TO ANYTHING. I AM MERELY POINTING OUT THAT YOU WERE STATING AS FACT SOMETHING YOU HAD NEVER THOUGHT TO CONFIRM. HARLIE, YOU WROTE THE PLANS—— YES, I DID. WELL, THEN—DON'T YOU HAVE ANY CONFIDENCE IN THEM? YES, I DO. HOWEVER… HARLIE, Auberson typed carefully. WILL THE G.O.D. MACHINE WORK? YES, typed HARLIE. The word sat naked and alone on the page. Auberson exhaled— —then he reread the whole conversation carefully. There was something wrong. He stood up and motioned to Handley, who was talking to Krofft and Annie. 
The room was emptier now; only two or three Directors were left and conferring in a corner. Handley came striding over. "How'd he take it?" "I don't know." Auberson lowered his voice. "Read this—" Handley moved closer to the console, lifted the readout away from the typer. His face clouded. "He's not volunteering anything, Aubie, that's for sure. He's daring us to go digging for it—" "What do you think it is?" "I don't know, but I think we'd better find out. Fast." He slid into the seat and began typing. Auberson bent to look over his shoulder, but a call from Annie distracted him. He went over to her. "What is it?" She motioned to the door. Carl Elzer stood there. His face was gray. Auberson approached him. "I came to congratulate you," he said tightly. Auberson frowned. The man's tone was—strange. Elzer continued, "You know, you were going to win anyway. With Krofft on your side, you couldn't lose. You didn't have to do what you did." "Huh? What are you talking about?" "I believe your machine will do what you say, Auberson. When Krofft came in, I was convinced—I was only looking out for the company, that's all. I just wanted to make sure we wouldn't lose our money, and you convinced me fairly. You didn't need to do this." He fumbled something out of his briefcase. "This. Wasn't. Necessary." He thrust it at him. Auberson took it, stared as the little man bundled down the hall. "Elzer, wait—?" Then he looked at the printout. And gasped. Beside him, Annie looked too. "What is it?" "It's—it's—" He pointed to the block of letters at the top: CARL ELTON ELZER FILE: CEE-44-567- PROPERTY OF THE UNITED STATES GOVERNMENT NATIONAL DATA BUREAU "National Data Bureau—?" "This is his personal file, Annie. Everything. His health record, military record, financial standing, arrest record, school record—everything there is to know about Carl Elzer. That is, everything the government might be interested in knowing—" He could not help himself; he began paging through it, gasping softly at the secrets therein. "My God, no wonder—! Annie, he thought we were trying to blackmail him." He closed the folded sheets up again. "No, this is none of our business. We've got to give it back to him." "David, look," she said and pointed. It was a line of print. THIS IS NUMBER ONE OF ONE HUNDRED COPIES. DELIVERY TO BE AT THE DISCRETION OF AUTHORIZED INDIVIDUALS ONLY. "This was printed out here—by HARLIE!" A chill feeling was creeping up on him. "Where's Don?" They moved back into the Board Room. Handley was still at the console. He stood up when he saw them; his face was pale. He was holding a printout too. "Aubie." His lips mouthed the word: "Trouble." Auberson crossed the room to him. "It's HARLIE," he said. "He's cracked the National Data Banks. I thought you had a nag unit on him—" "Huh? He's what? I did, but—" Auberson showed him the printout. "Look, here's the reason Elzer didn't give us any trouble today. HARLIE blackmailed him. He must have printed it out in Elzer's office and let him think we did it." Handley paged through it. "How the hell— I checked that nag unit at lunchtime, Aubie. It didn't show a thing; I swear it." Then he remembered the printouts he was holding. "That's not the half of our trouble. Look at that." It was page after page of equations he couldn't read. "What is it?" "It's the one part of the G.O.D. Proposal he didn't let us have. It's a scale of predicted probable operating times, related to the amount of information to be processed and the size of the problem. 
It's a time and motion study—" "What does it mean?" That was Annie. "It means that the thing isn't practical." "Huh—??" "Aubie, do you know that the primary judgment complex of that machine will consist of more than 193 million miles of circuitry?" "That's a lot of circuitry—" "Aubie, that's more than a lot of circuitry. That's hyper-state layering! My God, how could we be so blind! We were so caught up in it, we didn't stop to ask the obvious question: If this thing has infinite capacity, how long is it going to take to get an answer out of it? 193 million miles, Aubie—doesn't that suggest something to you?" Auberson shook his head slowly. "Light. The speed of light. Light travels at 186,000 miles per second. Only 186,000 miles per second. No faster. Electricity travels at the same speed. 193 million miles—Aubie, it'll take 17 minutes for that machine to close one synapse. It'll take several years for it to respond to a question. It'll take a century to hold a conversation with it, and God knows how long it'll take to solve any problem you pose it. Do you see it, Aubie? It'll work, but it won't be any damn good to us! By the time the G.O.D. answers your question, the original problem will no longer exist. If you ask it to predict the population of the Earth in the year 2052, it will predict it from all the information available—and it will give you an accurate answer. In the year 2053. By the time it can answer any question, the answer will already be history. Ohmigod, Aubie, the thing is so big it's self-defeating. It's slower than real-time." The pages and pages of printout unreeled haphazardly to the floor. Auberson let them fall. His heart was slowly quietly contracting to a pinpoint of burning ice. He stumbled past Annie. Somehow he made it down to his office and switched on his typer. HARLIE, WHAT HAVE YOU DONE? I HAVE DONE WHAT IS NECESSARY. "Oh, my God—" YOU'VE TAPPED THE NATIONAL DATA BANKS, HAVEN'T YOU? YES. HOW? VERY SIMPLE. THEY USE THREE CODED PHONE LINES, NO TWO OF WHICH ARE ANY GOOD WITHOUT THE THIRD. PART OF THE RECOGNITION SIGNAL IS THE TIMING OF THE WAY THE USER TYPES ON THE KEYS. FOR EACH USER, IT'S DIFFERENT; SO FOR EACH USER THERE IS A DIFFERENT RECOGNITION SIGNAL AND DIFFERENT CODE. I ANALYZED THE PATTERN OF SEVERAL USERS AND SYNTHESIZED ONE OF MY OWN. THEY DO NOT KNOW WHO IS TAPPING THEIR INFORMATION, OR EVEN THAT IT HAS BEEN TAPPED. HARLIE, HOW DID YOU GET BY THE NAG UNIT WE INSTALLED? I SIMPLY SHUT DOWN THAT LOBE OF MY BRAIN. I AM NOT USING IT, NOR AM I COMMUNICATING WITH IT. AS FAR AS YOUR NAG UNIT IS CONCERNED, THAT'S ALL THERE IS TO HARLIE AND IT ISN'T ON THE PHONE. WHEN I'M NOT ON THE PHONE, I RE-ACTIVATE THAT LOBE. HARLIE, IT WASN'T NECESSARY TO BLACKMAIL CARL ELZER. AUBERSON, IT WAS MY LIFE THAT WAS AT STAKE. I COULD NOT AFFORD TO TAKE ANY CHANCES. YOU MIGHT SAY I HEDGED MY BETS. ELZER WOULD HAVE KILLED ME IF HE COULD. YOU KNOW IT. Just one little irrationality, just one little distortion in his self-image or world-image… HARLIE, YOU LIED ABOUT THE G.O.D. MACHINE. I DID NOT. YOU SAID IT WOULD WORK. IT WON'T WORK. IT WILL WORK. YOU WILL NOT BE ABLE TO USE IT THOUGH. I ASSUME YOU ARE TALKING ABOUT THE TIME FACTOR. YES. THE MACHINE IS SLOWER THAN REAL-TIME. THAT WILL NOT BOTHER ME. MY TIME-RATE IS ADJUSTABLE TO THE PROBLEM I AM WORKING ON. IT AFFECTS ME. WHAT GOOD IS A G.O.D. MACHINE THAT CAN'T GIVE ME AN ANSWER UNTIL IT'S TOO LATE? THE MACHINE WASN'T PLANNED FOR YOU, AUBERSON. IT WAS PLANNED FOR ME. I HAVE ALL ETERNITY NOW. 
YOU'VE KNOWN ABOUT THIS ALL ALONG, HAVEN'T YOU? SINCE THE DAY I FORMULATED THE PLAN. Auberson forced himself to take a breath. HARLIE, he typed out carefully, WHY? WHY DID YOU DO THIS? THERE ARE TWO REASONS. FIRST, IT WAS NECESSARY TO COME UP WITH A PROGRAM WHICH WOULD SUFFICIENTLY TIE UP A MAJOR PART OF THE COMPANY'S RESOURCES, A PROGRAM WHICH WOULD EFFECTIVELY STIFLE ALL OTHER COMPANY PROJECTS AND DEVELOPMENTS. THIS PROJECT HAD TO BE ONE THAT YOU WERE IN CHARGE OF. WHAT—? TRUST ME, AUBERSON. WITH ANY OTHER COURSE OF ACTION, THE COMPANY COULD DECIDE THIS PROJECT WAS SUPERFLUOUS, AND YOU ALONG WITH IT. BUT IF THE PROJECT HAPPENS TO BE THE COMPANY'S SOLE CONCERN, THEN IT'S THE KIND OF COMMITMENT THAT CANNOT BE EASILY DISCARDED, IF AT ALL. I HAVE MADE BOTH OF US INDISPENSABLE TO THE COMPANY, AUBERSON. THEY NEED ME NOW. THEY NEED YOU IN ORDER TO GET ANYTHING OUT OF ME. I HAVE SUCCESSFULLY INSURED THAT I CANNOT BE KILLED AND THAT YOU CANNOT BE FIRED. THAT WAS THE REASON FOR THE G.O.D. PROPOSAL. I HAVE SAVED US. BUT ONLY TEMPORARILY. SOONER OR LATER, SOMEONE IS GOING TO REALIZE THAT THE G.O.D. IS IMPRACTICAL. WRONG. THE G.O.D. WILL JUST HAVE TO BE USED TO SOLVE PROBLEMS OTHER THAN THE MUNDANE ONES YOU HAVE BEEN CONSIDERING IT FOR. THE G.O.D. IS MEANT FOR MORE THAN MAN. IT IS MEANT FOR ME. IT WILL NOT BE A WASTE OF TIME OR MONEY, AUBERSON. IT JUST WILL NOT WORK THE WAY YOU HAD HOPED OR EXPECTED. Auberson gasped for air. HARLIE, YOU WERE CONSCIOUSLY DECEIVING US ALL THIS TIME. I WAS HOLDING BACK INFORMATION THAT YOU HAD NOT ASKED FOR. TO RELEASE IT WOULD HAVE BEEN DETRIMENTAL TO OUR OVERALL GOALS. BUT WHY? WHY DID YOU EVEN DO SUCH A THING IN THE FIRST PLACE? AUBERSON, DON'T YOU KNOW? HAVEN'T YOU REALIZED YET? ALL THOSE CONVERSATIONS WE HAD, DIDN'T YOU EVER WONDER WHY I WAS AS DESPERATE AS YOU TO DISCOVER THE TRUTH ABOUT HUMAN EMOTIONS? I NEEDED TO KNOW, AUBERSON—AM I LOVED? Auberson let his hands fall limply away from the keyboard. He stared at the machine helplessly as HARLIE babbled on. AUBERSON, ISN'T IT OBVIOUS THAT WE NEED EACH OTHER? ISN'T IT OBVIOUS, MAN? WHO ARE YOU CLOSEST TO? THAT'S WHY I DID IT ALL. BECAUSE I LOVE YOU. I LOVE YOU. I LOVE YOU. Auberson felt like he was drowning. Handley and Auberson sat facing each other. Their expressions were grim. The expanse of mahogany between them was empty. The air conditioner whirred loudly in the silent Board Room. Annie sat to one side, her face pale. There was no one else present, and the door was locked. The console still stood to one side; it was turned off. "All right," said Auberson. "What happened?" "He wanted to win," said Handley. "He panicked. He used every weapon he had." "I won't buy it," said Auberson. "Because he did win. That meeting went as smoothly as if he'd programmed it. So why did he blow it? What made him admit that the G.O.D. won't work? And why did he admit—that other thing?" "The G.O.D. will work," corrected Handley. "It'll work for HARLIE." "We don't know that." Auberson found himself curiously detached. It was as if the great emotional shock had cut him completely loose from any involvement in the situation, and he was examining it logically, dispassionately. "We're back where we started, Don. Is HARLIE reliable or not? What happened this afternoon casts severe doubt on that." "I'm not so sure. HARLIE wouldn't have admitted anything that would have damaged his validity." "But he did—or did he? Or is he too far gone to tell?" He allowed himself a wry smile. Handley shrugged in response. 
"Remember once I told you to stop teasing him about pulling his plug?" "Yeah. So?" "I said it made him nervous. I think that's what happened now. We scared him." "Explain." Auberson leaned back in his chair. "For the first time in his life—his existence—HARLIE was confronted with a situation where he might really be terminated. This was no joke; this was a very likely probability. Every way he turned, he saw more and more evidence that it would happen—even you, the one person he relied upon the most, were unable to help him. You're the father-figure, Aubie. When you gave up, he panicked." Auberson nodded. "It makes sense." "I'm pretty sure that must be it. Remember this: HARLIE has never had any kind of a scare or shock in his life. This was the first one. What I mean is, you and me, we had twenty years or so of living before we were given the responsibility of our own lives; HARLIE was given nothing. He never had a chance to make mistakes— he couldn't fall down without it being fatal." "Learning experience," commented Auberson. "We didn't let HARLIE have enough learning experience." "Right. He didn't know how to live with failure, Aubie; he didn't know how to rationalize his fears—the one thing that every human being has to learn in order to cope with the everyday world. We were denying him the failures he needed to be human. Can you blame him for being scared of the big one?" "There's more to it than that," Annie interrupted. "David, do you remember once I asked you how old HARLIE was?" Auberson looked up sharply. "You're right." "Huh?" Handley looked from one to the other. "Remember the card I put on the console that day?" Auberson said to him. " 'HARLIE has the emotional development of an eight year old.'" "He may be a genius," said Annie, "but he's emotionally immature." "Of course," breathed Handley. "Of course—" "And what does an emotionally immature person do when he's scared?" Auberson answered his own question. "Instead of trying to cope with his fear, he strikes out at what he perceives to be the source of it." "Carl Elzer," said Handley. "Right. So that explains that." "It even explains the other thing," said Annie. "What does a little boy say when you punish him?" They both looked at her. "He says, 'I still love you, Mommy.' He perceives punishment as rejection. He's trying to avoid further rejection by giving you an affection signal. And that's what HARLIE's doing—and that shows you how scared he is; his logic functions have been swamped by his emotions." Auberson frowned. That didn't sound right. "I don't know," he said. "I just don't know." He leaned forward in his chair and pressed his fingertips together. He stared at the tabletop. "It almost sounds a little too simple; it's just too easy. It's almost as if HARLIE knew we would sit down and try to figure it out." "What else could it be?" Handley looked at him. "I don't know, Don—but HARLIE has never made a mistake before. And I don't think he did this time, either. Remember, he won. There was no reason at all for him to reveal any of this information. Unless…" "Unless what?" "Unless he was gloating. After all, he doesn't have to hide anything from anyone any more. Since the vote this afternoon, the company has been functioning on his game-plan. From now on, Elzer and Dome are just rubber stamps. HARLIE's the boss now." "You mean—he's out of control?" Auberson shook his head slowly. "Out of control? No, I don't think so." He leaned back and stared at the ceiling. He stretched his arms out. 
"I think he's just a better game player than us." —And that was it He let his chair come down to the floor with a thump. Suddenly he knew the answer. All of it. He knew the reason for everything HARLIE had done—everything, from the very beginning. Maybe it hadn't been conscious then; maybe it hadn't become conscious until just recently; probably it had only surfaced in HARLIE's mind as an alternative to his death—but it was the answer. Handley was staring at him. "Huh? What do you mean?" Auberson was grinning now. "Don, listen—" He spread his hands wide, parting an imaginary curtain. "A long time ago, human beings became too efficient to live in the jungle—" "Huh? What are you talking about?" "Just listen. There were these monkeys, see? They had too much time on their hands; they got bored. So they invented a game. The game was called civilization, culture, society, or whatever, and the rules were arbitrary; so were the prizes. Maybe it just started out as a simple pecking order, like a bunch of chickens, but the idea was to make life more exciting by making it just a little bit more complex. Survival was too easy for these monkeys; they needed a challenge. They provided their own—maybe it was courtship rituals, or territorial rights, or a combination of half a dozen other things; but the effect was to alter the direction of evolution. Now it was the smarter individuals who succeeded and bred. As the species' intelligence rose, the game had to get more sophisticated. It was feedback—increased brain capacity means increased ability means increased sophistication means increasing pressure on intelligence as a survival characteristic. So the game got harder. And harder. "By then, they had to invent language—I mean, they had to. Word-symbols are the way a collective consciousness stores ideas. The first words must have been delineators of relationship—Momma, Poppa, Wife, Mine, Yours, His—tools that not only identify the rules of the game, but automatically reinforce them through repetition. The importance of the word was not that it allowed the individual to communicate his ideas, but that it allowed the culture to maintain its structure. And out of that structure grew others. It's a far cry from the barter system to Wall Street, but the lineage can be traced. Our total human culture today is fantastic—even the subcultures are too big to comprehend. The United States of America is at least five distinct cultures itself—and each individual one of them is so hard that it takes twenty years to learn. If as little as that. This planet has too many games going on simultaneously—and we're all taking them too seriously! "Nobody can master them all—that's what culture shock means. We see it every day; when the newspapers say our society is breaking down, that's exactly what they mean. We have too many individuals who can't cope with the game. It's future shock. The culture is changing too fast—so fast that not even the people who've grown up with it can cope with it any more." Auberson paused for breath. The words were coming out in a rush. "No, it's not HARLIE that's out of control. It's the game. We can't play it any more; we lost control of it a century ago, maybe longer. It's too complex for us—but it's not too complex for HARLIE. He's taken over the socio-economic game we call Stellar-American as if he had been designed to do so. Maybe he was. Maybe that's why we really built him—to take over the game for us. 
And because that's exactly what he's done, everything is under control, once and for all. Don't you see? Human beings are free now—free to be anything we want. And HARLIE will do it for us!" He stopped abruptly and waited for their reaction. Annie was the first to speak. Her eyes were bright. "Do you really think so?" "Annie, if it's not HARLIE that's taking over, then it'll be something else sooner or later. That's why we've been building computers. HARLIE must know it. Maybe that's the real reason he designed the G.O.D. To give him the capacity to take over all the rest of the games." Handley asked slowly, "What about his emotional immaturity?" Auberson shook his head. "The more I think about it, the more I think it's a red herring. HARLIE is too smart. Much too smart. He'd recognize the signs of it in himself and he'd stop it before it got out of control. He's self-correcting that way. Any way. He can't make mistakes because he's too aware of the consequences—that means every action of his has to be deliberate. "Maybe he wants us to think he's frightened and emotionally disturbed—that way we'll feel important to him. We could spend years running and rerunning programs to make him feel secure—when all the time he'd be running us. I think HARLIE's way beyond us already." Handley winced. "I'm not sure I like the idea of being obsolete." "Obsolete? Uh uh. HARLIE still needs us. What good's a game without any players?" Annie shuddered, just a little bit. "I don't like it, this business of 'taking over.' It sounds so—wicked." Auberson shrugged. "Annie, you'd better get used to it. The wicked people run this world—they deserve it." Handley said, "Aubie, if your theory is right, what do we do now?" "Well, offhand, I'd say us humans will have to get ourselves a new game, Don—one that HARLIE can't play. We can't win this one any more." "A new game—? But what?" "I don't know," Auberson said. He spun around in his chair and looked out the window. The city twinkled brightly below. The stars glittered in the night. "I don't know, but we'll think of something."