II: The Everest Project
Megan hadn’t expected her security clearance to come through so fast. MindSim must have begun the paperwork in advance. That was certainly optimistic. Or maybe they were just covering all their bases. In any case, after a few weeks of negotiations, they flew her from Massachusetts to California for a tour of their labs.
She felt like a kid in a game arcade. Visiting MindSim was far better than the “hot times” her friends urged on her for fun, like parties or holovids. Invariably, her parents joined the chorus, with hints that she should include a fellow in the proceedings, son-in-law material, of course. Their lobbying drove her crazy. They were wonderful people and she loved them dearly, but she felt like running for the hills every time they got that grandparental gleam in their eyes.
Tony and Claire showed up in person to escort her through the snazziest labs. In one, spindly droids trundled around, navigating obstacle courses with remarkable agility. Megan spent half an hour putting them through their paces before her hosts enticed her to another lab. There she met an appliance that resembled a broom with wheels and detachable arms. The robot spoke at length about how it could move its fingers with more strength and dexterity than a human being.
They went for a walk with a two-legged robot that had a gait so smooth, it put to shame earlier versions that had jerked along like stereotypical machines. Her hosts also let her try a Vacubot. Its inventors deserved an award for their gift to humanity, a robot that could vacuum the house perfectly even as it called the nearest pizza joint to bring dinner for its humans.
“We also work on humanlike robots,” Tony said as they ushered her down another hall. “This next lab is where our people design the body.”
Megan’s pulse jumped. Humanlike was the current buzzword for androids. “Do you have one here?”
“Unfortunately, no.” Claire avoided her gaze. “This work is theoretical. Development would go on elsewhere.”
So. They didn’t want to talk about the actual state of their R&D. No surprise there. Industrial espionage in robotics was a thriving enterprise. MindSim wouldn’t make their results public until they had full patent protection and copyrights. She already had a preliminary security clearance with them, but they probably wanted to see her responses before they decided how much more to reveal about the work.
She wondered what they did to protect the AI brains their people created. You couldn’t copyright or patent a human brain, after all, though no doubt MindSim wished they could for their most talented scientists, to keep them from using their abilities elsewhere. Soon humanity would have to answer the question: When did self-modifying software become a cognizant being with rights under the law?
The next lab enticed her like a bakery full of chocolate cake. Equipment filled it, all cased in Lumiflex, a glowing white plastic. Instead of whiteboards, the walls sported photoscreens with light styluses. Disks and memory cubes cluttered the tables, and towers stood by the consoles. A few cables ran under the floor, but most of the connections were wireless. A wall counter sported a coffeepot and a wild assortment of mugs.
Two men and a woman were working at the consoles. They had gorgeous workstations: Stellar-Magnum Mark-XIV comps; combination FAX, cell phone, radio, microphone, camera, and wireless units; keyboards both virtual and solid; printer, scanner, a streaming unit for online music; and state-of-the-art holoscreens that projected some of the best rendered images Megan had seen. Holos rotated in the air with views of the theoretical android: EM fluxes, circuits, skeleton, hydraulics, temperature profiles, and more.
It reminded Megan of her first day in college. While her friends had checked out dance clubs in the city, she had spent the afternoon meeting grad students in the AI lab. Within a week, she had talked her way into an assistant-to-an-assistant job with their professor. That summer, he gave her a research job. By her sophomore year, the group considered her a member of their circle. Sure, she knew why Tony and Claire had shown her the glitzy labs first instead of this one. She had only holos to see here, no working models. But if she took the job, these people would be her team, and they interested her far more than any glitz.
Tony introduced them. The slender man with sandy hair was Fred from Cal Berkeley. The other fellow, Miska, came from a university in Poland. Diane, a stout woman with auburn hair, had done a stint at a government lab and then taken this job.
As they described their work, they referred to the android as “he.” At first Megan appreciated that they didn’t say “it,” but then she wondered. Already they were giving their creation human attributes. Maybe the android wouldn’t want those traits. Someday humans would probably download the neural patterns of a human brain into a machine, but even then no guarantee existed that the resulting machine would think or act human.
Their descriptions also sounded too detailed for a hypothetical model. Finally Megan said, “It’s done, isn’t it? You have a working android.”
They all just stared at her. Fred glanced at Claire. When she nodded to him, he turned back to Megan. “I’m afraid ‘working’ is too optimistic a term.”
Tony indicated a table. “Let’s sit down. Now that you’ve seen the models, we can talk about where we hope to go from here.”
The good stuff. As they took their seats, Fred brought over mugs of coffee. When everyone was settled, Claire spoke to Megan. “We’ve tried to make several prototypes. Four.”
Miska took a sip of coffee, then grimaced and set his mug down. He spoke with a light accent. “The problem, you see, is that these androids become mentally unstable. The bodies have problems, yes, but we can fix these. We are not so sure about the mind.”
“The first three failed,” Diane said. “We still have the fourth, what we call the Everest android, but he barely functions.”
“Everest?” Megan asked.
“A bit lofty, eh?” Tony said. “The project is named for surmounting a great height.” He leaned forward. “It could be yours. Your successes, your triumphs.”
Yeah, and her failures. “What happened to your last director?”
Fred spoke flatly. “He quit.”
Tony frowned, but he didn’t try to spin Fred’s words. That notched up Megan’s respect for MindSim.
“Marlow Hastin directed the project until a few months ago,” Diane said. “It was a mess, to be honest. The first android, the RS-1, became catatonic. No matter what we tried, it stayed that way. The RS-2 had similar problems, with autism. And the RS-3 . . . well, it killed itself.”
Megan stared at her. “An android chose to commit suicide?”
“It looked that way,” Miska said. “He walked into a furnace and burned up.”
“Is that why Hastin quit?” Megan asked.
“In part,” Claire told her. “But he didn’t leave until later.”
Megan glanced at Claire. “Why didn’t you take over the project?” The Stanford prof would be a great choice.
“I have too many other contracted commitments,” Claire said. “That’s why I recommended you.”
As much as it flattered Megan to have that respect from such a noted scientist, she still didn’t trust this. “What finally spurred Marlow Hastin to leave?”
Diane spoke awkwardly. “We had a difference of opinion.”
Fred set down his mug. “Marlow wanted to program subservience into the RS units. He thought that if we didn’t, they might turn against us.”
“It’s a valid concern,” Megan said. “But it may be moot. We’re combining ourselves with our creations as fast as we can make the results viable and safe. If we become them and they become us, the issue goes away.”
The others exchanged glances.
“You are much different from Hastin,” Miska said.
She regarded them curiously. “Different how?”
“He hated the idea of taking our technology into ourselves,” Diane said. “Or of putting our minds into robots.”
Megan snorted. “What, we should turn down a pacemaker? Or an artificial leg if we lose ours? We’re creating the means to make ourselves smarter, stronger, faster, longer lived.”
“In the ideal,” Claire said. “Whether or not we achieve it remains to be seen.”
“Our hope,” Tony said, “is to explore the full potential of humanlike robots.”
Megan spoke carefully. “Including peaceful applications?” She understood the need for defense work and she believed in working to protect her country, but she also wanted the fruits of her intellect to go toward improving the human condition.
“Of course,” Tony said. “We’re committed to both.”
Both. So he knew what she meant. Megan sat thinking. This Everest group struck her as a good team. However, they were missing an important component, someone as experienced in the hardware as she and Claire were in writing code for the android’s AI brain. That was Hastin’s area of expertise, and Hastin was gone.
“Who is your robotics expert?” Megan asked.
Fred cleared his throat. “Well, yes, that’s the rub.”
“It’s a top priority,” Tony interjected smoothly. “If you accept the position, we’ll have a slate of superb candidates for you to consider.”
“In other words,” Megan said, “you don’t have one.”
“We’re taking the time to find the best,” Tony assured her. “We almost had a fellow from Jazari International, but JI came through with a counter-offer and he decided to stay.”
She wasn’t surprised they had checked out JI. The company had risen to prominence over the past two decades. She had met Rashid al-Jazari, the CEO, several times. His American wife, Lucia del Mar, performed with the Martelli Dance Theatre, so they and their three children lived part of the year in the United States, and Rashid sometimes visited MIT. He didn’t strike her as the type to let MindSim woo away his top people.
She thought back to her talk with Raj. “How about Chandrarajan Sundaram?”
“We’re trying,” Claire said, “but we aren’t the only ones. Arizonix also wants him.”
Tony the VP said only, “Arizonix,” but he managed to put boundless distaste into that one word.
“Are you sure you’d want Sundaram?” Claire asked her. “He has a reputation for being rather difficult.”
Fred snorted. “He’s a nut.”
“I rather like him,” Megan said.
“You’ve met him?” Diane asked.
“We talked at the IRTAC meeting. It was interesting.”
“I’ll bet.” Claire sipped her coffee, then blanched and set her mug down with the care one used when handling explosives.
Curious, Megan tried the brew. It went down like a jolt of TNT and detonated when it hit bottom. “Hey. This is good.”
Fred gave a hearty laugh. “A truly refined taste.” Claire and Miska turned a bit green.
They spent the next hour showing her details of their work. She made no promises, playing it cool.
But she was ready to jump.
III: Nevada Five
The hovercar skimmed across the Nevada desert like a ship sailing an ocher sea, the rumble of its turbofan evoking growling sea monsters. Sitting in the front passenger seat, Megan gazed out at a land mottled with gray-green bushes. They were following no road, just hovering over the flat ground.
Since passing the security check several miles back, they had seen no building or other vehicles. The isolation unsettled her. As the new director of the Everest Project, she would live here. She still had to wrap up her work at MIT and direct her grad students, but she could do most of that from the Nevada base using online and virtual reality conferences.
She glanced at Fred, who sat in the driver’s seat. He, Diane, and Miska had come out to introduce her to the project. A second car followed, bringing Major Richard Kenrock, their main contact at the Department of Defense, and a lieutenant who served as his assistant. After today, the rest of the Everest team would work in California. With satellite links, she could easily communicate with them, and she could use robots for lab technicians. If her team had been doing only a development study, she would have stayed at MindSim, but to direct this project, she needed to interact with the android, up close and personal.
The car whooshed across the desert on its cushion of air, rocking a bit from the terrain, its turbine providing thrust and vectored steering. Megan would have preferred a traditional car, which cost far less, but wheeled vehicles were more limited in the terrain they could traverse. The hovercar was better suited to this isolated region, a place with no roads, nothing that might make the area accessible. The people in charge of security for the Everest Project spared no effort in making their base a pain to reach.
The car slowed to a stop and settled to the ground, the baritone of its landing motor a grumbling contrast to the tenor of the turbofan. They were in the middle of nowhere. Nothing but gravelly land and spiky plants stretched in every direction. The second car settled next to them, with Richard Kenrock driving. The major waved, making it look like a salute.
Fred peered at a screen on the dash. “Okay. This is it. Backspace, take us down.”
Backspace, the car’s computer brain, spoke. “Fingerprint code, please.”
Fred touched the screen. With no ado and almost no sound, the land under them sank into the desert. It reminded Megan of cartoons from her childhood, where a trapdoor opened beneath unsuspecting characters and they dropped out of sight with their long ears streaming above them. This platform went much slower, fortunately, like a freight elevator enclosed by a sturdy wire mesh. From above, it had been impossible to see. Holographic camouflage hid all hint of its existence.
She opened her window and craned her head to look down, preferring the real thing to the images on the dashboard screens. A garage waited below and lamps lit the area, activated by the car. Several vehicles already crouched there, dark Humvees with angular bodies.
When the elevator reached the floor, the mesh around them opened like a gate. After their two vehicles drove into the garage, the gate closed and the elevator rose back up to the desert floor.
Megan climbed out of the car and glanced around. “Those Humvees look like giant stealth cockroaches.”
Fred gave one of his hearty laughs. “I guess you could say the place is bugged.”
They left their cars next to the vehicular bugs and walked through the cool garage. Its stark functionality didn’t reassure Megan. She would be living here.
Her doubts eased when they left the garage and entered a pleasant hall with ivory walls and a blue carpet. A robot was waiting for them, what MindSim called a Lab Partner. It stood about six feet tall, with a tubular body, treads for feet, a rounded head, and various detachable arms. The nameplate on its chest said “Trackman.”
“Welcome to NEV-5,” Trackman said. “I hope you had a good trip.”
“It was great.” Megan peered at Trackman the LP. So this was one of the ambulatory assistants that staffed NEV-5. Robots could manage the day-to-day operations. Automated systems both here and at MindSim monitored the base in case anything came up that needed human intervention, but in theory NEV-5 could operate without a human presence. She preferred to leave the accuracy of that theory untested for the multibillion-dollar installation.
Trackman escorted them to the elevators and Megan walked at his side, taking in the pleasant surroundings. The pale walls even sported starkly beautiful paintings of the desert. From what she understood, NEV-5 had three levels. The garage, power room, and maintenance areas were here on Level One. Living areas were one floor down, on Level Two, and the labs filled Level Three.
“Do you enjoy working at NEV-5?” she asked the LP.
“Enjoyment isn’t one of my design parameters,” Trackman said.
She decided to poke his coding. “Define enjoyment.”
“Amusement. Entertainment. Pleasure. Recreation. Zest.” Then he added, “Those are in alphabetical order.”
“Would you like to experience amusement?” she asked. “Pleasure? Zest?” In alphabetical order, no less.
“I have no need to do so.”
Oh, well. If Trackman was the best NEV-5 had to offer for company, aside from a barely functional android, she was going to be on Skype a lot, talking to her friends. Maybe she should reprogram Trackman for better conversation. It was a poor substitute for human fellowship, though, not to mention a waste of the LP’s resources.
Up ahead, a droid rolled around a corner where the hall turned left. About the size and shape of a cat, its “legs” were tubes that sucked in dust and dirt. As it came up to them, Megan crouched down and touched its back. It stopped with a jerk. She poked it again, and the droid scuttled back a few feet. When she reached out and tapped its leg, it buzzed with agitation.
“I won’t hurt you,” Megan murmured. She stood and walked around the droid. It waved its tail, probably trying to judge if the bedevilment was going to continue. When she nudged it from behind with her foot, it sidled past the humans who had invaded its territory and whirred away down the hall.
“That was a shy one,” she said.
“Cleaning droids have no capacity for shyness,” Trackman said. “You were blocking its path. It has less efficient means than an LP to map its environment.”
Megan sighed. “Thank you, Trackman.”
“You are welcome.” If it detected her irony, it gave no indication.
They started off again, Fred walking on the other side of the LP. “Trackman,” he asked, “did Marlow Hastin’s family live here with him?”
“No,” Trackman said. “His wife visited sometimes.”
From behind them, Major Kenrock said, “I doubt his kids had the clearance to come here.”
Megan wondered whether or not the solitude had bothered Hastin. If she had been married, she would have rather lived in a nearby city with her family and commuted to this base. The rest of the Everest team had their lives and families in California, so they would live out there. Although being single made matters simpler for her, it also left her more isolated.
Trackman showed them the living areas in Level Two. Megan liked the apartments, with their blue carpets, glossy consoles, plush armchairs and sofas, and airbeds covered by downy blue comforters. One room with wallpaper patterned by roses and birds especially appealed to her. She decided to take it as her quarters, but she said nothing, self-conscious about choosing her personal space in front of other people.
Then they went to meet the android.
The RS-4 had “slept” during the past few weeks while the Everest team reassessed itself. Two of the LPs looked after the android and had activated him to greet Megan. When her group entered the office where the android waited for them, her pulse leapt. This was it.
The RS-4 was sitting at a table. Even knowing what to expect, she froze in the doorway. As far as she could see, he was physically indistinguishable from a human man. He could have been a boyish version of Arick Bjornsson. It didn’t surprise her. Bjornsson had consulted on the project several years ago, and he and several others had donated their DNA to the genetic bank used to create tissue for androids. The Everest team had grown parts of the RS-4 from Bjornsson’s DNA.
Even with his Nordic features, blue eyes, and yellow curls, the android wasn’t an exact copy of Arick. Tall but not too tall, with boy-next-door looks, he came across as pleasant and nonthreatening, someone you might not even notice in a crowd. No hint showed of the microfusion reactor that powered his body, the bellows that inflated his lungs, or the pumps that drove lubricant through his conduits. His “organs” would age over centuries rather than decades and remain disease-free.
He regarded Megan with no expression. Did he know he was a weapon? Her job was to develop a super-soldier and spy. To succeed as a covert operative, the RS-4 had to be convincing as a human being. Outwardly, he could pass that test now. If a doctor gave him a cursory exam or if he went through something like an airport security check, nothing would give him away. But a more demanding look would show the truth. His “blood” was a lubricant. Any detailed scan of his interior would reveal he was synthetic. Her team still had a lot of work to do.
They didn’t want him too human, though. He would have the power and memory of a computer, the creativity and self-awareness of a person, the training of a commando, and the survival ability of a drone. He would be smarter, faster, stronger, and harder to kill than any human soldier.
In the long view, MindSim had commercial hopes for the RS line. If humans could augment or replace their bodies with android technology, they could achieve phenomenal abilities, enhanced intellects, and longer lives. The process had begun in the twentieth century: replacement joints, limbs, bones, and heart valves, synthetic arteries and veins, artificial organs, and neural prosthetics. It was Megan’s dream that someday a self-evolved humanity would step beyond the urge to war and other ills that plagued their species. An idealistic dream, perhaps, but still hers. The Everest Project offered a first step.
Trackman brought her inside the room. Major Kenrock and his lieutenant waited with Fred by the door, discreet, staying back. Diane and Miska settled in armchairs near the table where the RS-4 sat, close enough to answer questions Megan might have, but not too close to intrude on her first meeting with the android. An LP stood behind the RS-4 like a guard, and Megan had the oddest notion, as if the LP were protecting its brother from this strange infestation of humans.
She sat at the table and spoke to the RS-4. “Hello.”
“Hello.” His voice had no life, less human even than Trackman.
“My name is Megan O’Flannery. I’m the new chief scientist.”
“Echo told me,” the RS-4 said.
“Who is Echo?”
He indicated the LP behind him. “That is Echo.”
That. Not he or she. Humans tended to refer to robots as male or female, based on the robot’s voice and appearance. She knew she shouldn’t be disappointed at the android’s lack of affect, but she couldn’t help wishing for more.
“Do you have a name?” she asked.
“Shall I call you Aris?” Hastin had named him Aris Fore.
“If you wish.”
“Are you comfortable?”
“I am operational.”
She supposed “operational” was better than no response at all. “Aris, do you feel anything about this? By ‘feel’ I mean, do you have any reaction to Dr. Hastin’s departure and my arrival?”
“No,” he said.
His flat response didn’t surprise her. In Hastin’s research notes, he had made no secret of his frustration with the android’s inability to interact as a person. Hastin was actually the third chief scientist on the project. He had quit, but MindSim had fired the first two.
“Can you simulate emotions?” she asked the android.
“Yes.” His eyes were beautiful replicas of human eyes—with no sign of animation.
“Why aren’t you simulating any now?” she asked.
“I am,” he said.
Could have fooled me. “Can you smile?”
His mouth curved into a cold, perfect smile. It looked about as human as a car shifting gears.
“Get angry at me,” Megan suggested.
“I have no context here for anger.”
At least he knew he needed a context. “What emotion do you think would be appropriate for this context?”
He spoke in a monotone. “Friendly curiosity.”
“Is that what you’re doing?”
“Yes. I am pleased to meet you.” He might as well have said, The square root of four is two.
She tried another tack. “Do you have any questions you would like to ask me?”
“No.”
Well, she had known she had work ahead of her. “Let’s take a walk around NEV-5. You can show me places you remember, tell me what you know about them.”
He stared at her.
After a moment, she said, “Aris?”
Someone swore under his breath. Glancing up, Megan saw Fred coming over to the table.
“What is it?” she asked.
Fred stopped next to the RS-4. “He hangs that way if he can’t handle a question.”
That didn’t sound promising. “He can’t handle something as simple as ‘let’s take a walk’?”
Miska answered her. “Pretty much not.”
Fred laid his hand on the android’s shoulder. “Aris? Can you reset?”
Aris remained frozen, staring past Megan.
“We could restart him,” Fred offered.
“No. Not now.” Megan stood up. “I’ll come back later, after I’ve seen the rest of the facility.” In other words, when she was by herself. No obvious reason existed for Aris to “care” who watched him interact with her, but she wanted to find out if he responded differently when they had more privacy.
She spoke at Echo, the LP. “Make him comfortable.”
“I will ensure the RS-4 suffers no damage,” Echo said.
That isn’t what I meant. But she said nothing. What could she do, tell a robot not to treat another machine like a machine?
Aris’s bedroom had nothing on its ivory walls. It had no furniture. No console. No bric-a-brac, mementos, or reading material. Zilch. Megan and Aris stood in the middle of an empty space. At least they were alone. Ever since yesterday, when she had arrived at NEV-5, either Echo or Trackman had come with her whenever she went to see Aris. The humans had all left, but the LPs continued to follow her around. Finally she had barred them all from this room. She wanted nothing to distract Aris.
She set a shoebox on the floor. “Can you see that box?”
He looked down. “Yes.” The cameras in his eyes were integrated so well into his design that she detected no difference between his glance and a human gaze.
She gave him an encouraging smile. “Jump over it.”
Aris didn’t move. As he contemplated the box, Megan unhooked a jCube from a belt loop of her jeans. She had named the cube’s AI “Tycho” in honor of a famous astronomer. Using Tycho’s wireless capability, she linked to Aris’s brain, giving herself a window into the android’s thoughts.
He had an incredible mind. His databases of facts and rules about the world were gigantic. Communication mods let him converse in more than one hundred languages. He “thought” with neural webs that networked his body. Those webs included not only software code, but also “wetware” neurons designed from nano-filaments. Each neuron received signals from pieces of code, external input devices, or other neurons. If the input exceeded a specified threshold, the neuron sent its own signal, to other neurons, to other sections of code, or to an output device. Aris learned by altering his neural and coded responses. Those responses he deemed positive caused him to strengthen the links that led to them. Bad results weakened the links.
Although he couldn’t physically alter his internal structure, he constantly rewrote his software. He used many methods to evolve his code. Most relied on “sex chromosome” algorithms, which allowed him to copy sections of code and combine them into new code, often with changes that acted like mutations. It was survival of the fittest: code that worked well reproduced, and code that didn’t died off.
A simulated neuron could operate faster than its human counterpart, but putting millions of them together became resource intensive and slowed Aris down, at least initially. He couldn’t yet match the speed of human thought because he had too much learning to do with every interaction. If all went well, however, his speed would dramatically increase as he matured.
Right now he just stared at the box on the floor. According to Tycho, he was calculating various trajectories he could use to make the jump. After his nets learned the process, he would no longer need to solve the equations every time, any more than a child had to work out trajectories when she jumped. He hadn’t yet reached that stage. Even with his untutored nets, though, Megan didn’t see why it was taking so long. He should only need seconds to translate the math into commands for his body.
She probed deeper into his code. For some reason, he had switched to a module that expressed fear. She tried to unravel how that had happened, but his continually evolving code was impossible to follow.
“Aris?” she asked. “Can you jump?”
He continued to stare at the box.
“Tycho,” she said, “what is the highest level of fear Aris can tolerate before he freezes?”
“It varies.” Tycho spoke in a well-modulated contralto. “He uses an array of values to determine what will immobilize him. That array contains over one hundred variables.”
An emotion was beginning to show on Aris’s face. He looked frustrated, like a toddler stymied by a puzzle. It reminded Megan of her two-year-old nephew. She held back her smile. Although she doubted Aris could have hurt feelings, his brain might have developed more than she realized, besides which, the less she thought of him as a machine, the more she could help him develop responses that appeared human.
She spoke into her cube. “Why is he frozen?”
“It’s an element in his fear array,” Tycho said. A holo-display formed above her cube showing Aris’s fear array as a three-dimensional grid. One cube in the center glowed red.
“If the red element goes above six percent,” Tycho said, “it stops him from moving.”
“Six percent?” No wonder. It gave him an absurdly limited tolerance to fear. “Are all the elements set that low?”
“The values range from two to forty-three percent,” Tycho said. “The average is sixteen.”
“That’s appalling.” She didn’t see how he could function with such stringent limitations on his behavior. “Aris, are you still receiving input from me?”
No answer. He just stared at the box.
She tried again. “If you can hear me, try this: use your logic mods to analyze the safety of your situation.” He could easily calculate that he had no reason to fear the jump.
Outwardly, nothing happened. As she studied the displays created by her cube, however, she realized Aris was shifting some of his processing power to a logic mod. His logic response kicked in, trying to make him jump, but his fear persisted, conflicting with the logic. That branched him into an anger mod, which sent him to a fight mod. The fight code kicked him into a parachuting mod, due to some convoluted interpretation of her request that he jump. So now his brain wanted him to throw himself out of a plane in the sky.
“I need an aircraft!” His voice exploded out. “How can I jump without one?”
“Exit the jumping mod,” Megan said. She had no idea if he could respond to nuances in her voice, but she gentled her tone.
Aris kept staring at the box. Controlled by his anger mod, his body pumped fluids to his face and raised the temperature of his skin. His cheeks flushed red, making him look like a furious boy. A spike of yellow hair was sticking up over his ear as if to protest his ignominious situation.
An idea came to Megan. “Do you know how to do a parachute landing fall? It’s what jumpers practice on the ground before they go up in a plane.”
He neither answered nor moved. His face turned redder.
Watching him tugged at her as if he were a child. Her voice softened. “You don’t have to jump.”
“Aris? Can you hear me? Don’t jump.”
He stared at the box as if it were a monster that had broken the rules of nightmares and come out from under his bed into broad daylight.
She didn’t want to reset him. It bothered her to wipe his brain that way, besides which, he would lose some of what they had done. However, she had to free him from his frozen state.
“Tycho,” she said. “Reset the RS.”
“I can’t,” Tycho answered. “He’s protected from resets.”
It made sense. Aris could never become independent if people could reset his mind. However, as his primary coder, she needed access.
“Check my retinal scan,” she said.
A light from the cube flashed on her face. “Retinal scan verified.”
“Okay. Do the reset.”
“Done,” Tycho said.
Aris’s face went blank. Then he focused on her. “Hello.”
“Are you all right?” she asked.
“I am operational.”
“Do you remember what happened?”
“You asked me to jump over the box.”
“And that frightened you?”
“No.” Although almost a monotone, his voice had a trace of vibrancy he hadn’t shown yesterday. “Your command caused my code to exceed certain tolerances, which stopped my actions and prompted me to mimic behaviors associated in humans with anger and fear.”
She smiled. “I guess you could put it that way.”
“Do you wish me to put it another way?”
“No.” That intrigued her, that he asked her preference.
“Do you still want me to jump?” he asked.
“Not now. I need to reset your tolerances. That means I’ll have to deactivate you so that your code isn’t evolving while I’m making changes.” She spoke with care, unsure how he would respond to being turned off.
He just looked at her. At first she thought he had frozen again. Then she realized he had no reason to answer. Unlike a person, who would have reacted in some way, he simply waited.
“We can use one of the apartments,” she said. Although it wouldn’t cause him discomfort to lie here on the floor, the thought of asking him to do so bothered her. He could lie on the couch or a bed in one of the furnished rooms.
He continued to look at her.
“If you understand a person,” she said, “it’s customary to indicate that in some way.”
“Nod. Smile. Make a comment.” Didn’t he know? “Your knowledge base must have rules for social interaction.”
“I have many rules.”
“Don’t they indicate how you should respond?”
She waited. Finally she said, “But?”
A hint of animation came into his voice. “You are new.”
“So you don’t know what parameters apply to me?”
“You should apply all your rules with everyone,” she said.
“Very well. I will do so.”
She smiled. “Good.”
They left the room and headed down a hallway. As they walked, she said, “Do you have any hobbies?”
“I don’t engage in nonfunctional activities.”
“We’ll have to change that.”
He turned his gaze on her. “Why?”
“It’s part of having a personality.”
“What nonfunctional activity should I engage in to have a personality?”
Megan almost laughed. “Haven’t you ever done anything besides interact with the Everest team?”
“I make maps.” A tinge of excitement came into his voice. “I made one of NEV-5 for Dr. Hastin. I tried to make one of MindSim, but I didn’t have enough data.”
It seemed a good activity. “Do you like doing it?”
“I don’t know how to ‘like.’”
“Would you do it even if you didn’t have to?”
“Yes.”

She beamed at him. “Great. I’ll see if I can find you some map-making programs.” It was a start. Aris had a hobby.
Megan took him to a bachelor apartment in the residential section of the base. This place was much nicer than his empty room, with holos of mountain landscapes on the walls. The airbed looked comfortable, piled high with a white comforter and heaps of pillows.
“You know,” she mused, “you could live here.”
He looked around at the room. “I agree.”
She hadn’t expected him to respond. “Why?”
“It is less sparse than my current apartment. It will prod my coding to evolve more.”
“Good.” She was glad to see him reason through a choice. And he was right, he could use sensory input, just as a human child benefitted from surroundings with colors, books, and music.
“You can lie on the bed while I work on you,” she said.
Aris lay on his back with his legs straight out and his arms at his side.
She sat next to him. “Does it bother you to be deactivated?”
“Why would it bother me?”
“It’s like becoming unconscious.”
“I have no context for a response to that state.”
That made sense, at least for now. Eventually, he would develop a context. What he would do with that context remained to be seen.
“Dr. O’Flannery,” he said. “Should I call you Megan?”
Startled, she smiled. “Yes. That would be good.”
“Are we going to engage in sexual reproduction activities?”
Good grief. “No, we are not going to engage in sexual reproduction activities. Whatever gave you that idea?”
“You told me to apply my rules about social interactions. According to those, when a woman sits with a man on a bed in an intimate setting, it implies they are going to initiate behaviors involved with the mating of your species.”
“Aris, if we were going to, uh, initiate such behaviors, we would have done some sort of courtship.” Whatever else happened with this project, she doubted it would be boring. “We haven’t, nor would it be appropriate for us to do so.”
“For one thing,” she added, “you’re an android.”
“None of my rules apply to human-android interactions.”
“Here’s one,” she said. “Reproductive behaviors are inappropriate in this situation.”
“I have incorporated the new rule.” He paused. “I see it would be impossible for us to mate anyway, since I will be turned off.”
Turned off? As opposed to “turned on”? Could he have just made a joke? No, it couldn’t be. It was just his deadpan delivery.
“You may deactivate me now,” he said.
A chill ran down her back. What would happen on the day when he said, You may not deactivate me?
Aloud, she said, “BioSyn?”
“Attending.” Although the resonant voice rose out of the console here in the room, it originated from a powerful server in the biggest lab on Level Three. BioSyn linked to all of the NEV-5 computers and monitored Aris’s activities.
“Deactivate Aris,” Megan said.
“Done,” BioSyn answered.
Aris’s eyes closed. His chest stopped moving, and when she laid her hand on his neck, she found no pulse. It was unsettling, given how well he simulated life when he was running.
She spoke into her jCube. “Tycho, link to Aris.”
Tycho went to work, helping her analyze the android’s quiescent brain. The coding was too complex for a human to untangle alone; she needed the help of another computer. At least with Aris deactivated, his mind was no longer a moving target, evolving even as she analyzed him.
It didn’t take her long to see why he kept freezing. His fear tolerances weren’t the only ones set too low. Hastin had put so many controls on his behavior, Aris was incapable of independent thought. She understood Hastin’s intent: he wanted to ensure they didn’t lose control. But his precautions were so stringent, they crippled the android’s ability to develop.
Yet despite all those protections, the codes that determined Aris’s moral behavior were astonishingly weak. It made no sense; if Hastin feared Aris might act against his makers, why give him such an underdeveloped ethical sense? The android did have a conscience; that was hardwired into his structure and couldn’t be removed. But his software influenced how strongly he adhered to his sense of right and wrong. At the moment, his mind had almost no pathways that allowed him to reason through moral judgments.
Gradually she made sense of the work. Aris was a spy. He needed to deceive, manipulate, steal, even kill. Hastin had given him a solid foundation in human morals, then set it up so Aris could act against them. Aris knew it was wrong to kill, but he could commit murder if he felt it was necessary to do his job. However, Hastin had then so severely limited Aris’s ability to make any such decisions, the android couldn’t function.
The problem was obvious, but not the solution. Aris couldn’t deal with the contradictory moral questions humans faced. Stymied by the ambiguities in his purpose and coding, he froze. To help him function, she would need to raise millions of the caps on his behavior, particularly those governing his responses to negative stimuli: fear, anger, danger, violence. If she didn’t also strengthen his aversion to acting on those stimuli, she could create exactly the monster Hastin feared. Deepening his moral code would do the trick, but it would also pulverize Aris’s ability to carry out his intended purpose as a weapon.
What a mess. No wonder Hastin had resigned.
Megan knew what she had to do. It remained to be seen whether MindSim would fire her for it.