"Colonel, it is my understanding that this Emilson is combative, that he's actually trying to guard the A.I. I'll give you an opportunity to talk to him. Perhaps if he knew the real reason we want the mother program-if he understood our plans for it-then you might be able to reason with him. You may even be able to reason with the A.I. inside him. Maybe you can convince them to separate willingly. What do you think?"
"I think it's worth a shot, Mr. President. If that doesn't work, I'll instruct the removal team to extract the A.I. using any means necessary."
"Excellent, Colonel. Excellent. Thank you."
16.
"Are you working on a plan to get us out of here?" Craig asked the A.I.
"I'm afraid escape is currently unachievable. Without your MTF generator, there's no way to overcome your bindings, which have an electronic locking mechanism."
"That's not very encouraging," Craig replied in a low tone.
"I'm sorry, Craig, but it appears we will need the introduction of new elements in the scenario before we can execute a viable escape plan. In the meantime, the one thing I can do is thwart the Purist extraction team's attempts to separate the nans that carry my mother program from your neurons. This will buy us more time."
"Okay. I guess we keep our eyes peeled then."
"Yes."
A moment later, the door to the room opened, and Colonel Paine entered, his head bowed, wearing his uniform cap low over his prosthetic eyes. A man Craig didn't recognize trailed at his heels, a look of uncertainty on his face.
"That is Professor Sanha Cho," the A.I. informed Craig.
"Ah," Craig replied. "Thanks."
Paine looked up and followed Craig's eye line to Sanha. "Heh. I guess I don't have to introduce you then."
"Got it covered," Craig replied.
Paine nodded. He placed his hands on his hips and turned away for a moment, staring off into the dark corners of the room, mulling over his thoughts. Craig could have sworn that Paine seemed depressed. "Can you keep a secret, Doc?"
"Sure."
"I'm not long for this world, as they say." Paine stepped forward and removed his cap, and it quickly became apparent why he'd been wearing it low. His face was so pallid that he appeared like a corpse, and his hair was beginning to fall out in clumps-a feature he demonstrated by rubbing his mechanical hand over his scalp, causing the salt-and-pepper hair to rain down onto the ground.
"He's suffered a lethal dose of radiation," the A.I. quickly noted.
"That fallout in Shenzhen was a real b.i.t.c.h," Paine said, taking a crack at dark humor. He didn't smile, however, and the golden irises on his ocular implants seemed even more lifeless than usual.
"With symptoms this pervasive already, he'll be dead within days if he doesn't get medical treatment beyond Purist capability. I'd say he's mere hours away from being bedridden."
"Ironic," Craig observed.
"What is?" Paine asked. "That I'm dying?"
"That the technology you've fought against is the only technology that can save you."
Paine sighed, placing his hand across his abdomen to soothe the twisting muscles in his midsection. "I'm not against technology, Doc." He held up his cybernetic arm as evidence, then pointed with its mechanical finger to his computerized eyes. "Obviously. However, I am against threats to the survival of our species."
"Then you should have no problem using nans like the ones inside of me," Craig replied. "If you weren't a murderous piece of garbage, I'd have my A.I. whip up a batch for you. You'd be right as rain in no time."
Paine stood, frozen. His tongue pressed against one of his molars, which was beginning to come loose; he tasted salty blood oozing from his gums. It wasn't easy falling apart. "I really wish you didn't feel that way, Doc. There are things you haven't considered. For instance, that nanobots of the sophistication that you have inside you are dangerous."
"Really?" Craig scoffed. "I was exposed to the radiation in Shenzhen even longer than you were, but I'm fine. The nans are okay in my books."
"Sure, for now, but have you had the time to consider what nan.o.bots could do if they form a large enough network? They communicate with one another, right?" Paine pointed briefly to Craig's skull. "They're just like the neurons in your brain. One neuron doesn't do a whole lot. h.e.l.l, you can kill a bunch of 'em with a night of hard drinking and not be much worse for wear in a couple of days. But get 120 billion of those little suckers together, and it makes you you-a consciousness. Nan.o.bots like the ones the post-humans were recklessly using-like the ones inside of you now-are a h.e.l.l of a lot more sophisticated than a neuron. Imagine if they formed a consciousness-a consciousness whose motives we'd never be able to predict. Nah, Doc. I'm no hypocrite. I'll die before I put untested technology like that inside me."
"You'd be afraid of your own shadow if someone told you Aldous Gibson invented it."
Paine managed a faintly amused grin, but it melted when he briefly considered that it might be his last. "You know, Doc, I think you're right about that. I'd think twice about anything that Gibson created, which brings me to my reason for this chat." Paine held out one of his cybernetic arms and gestured toward Sanha. "Your A.I. has already told you that this is Professor Sanha Cho, a former post-human. What your A.I. hasn't told you-what it didn't know-what I didn't even know until twenty minutes ago-is that Professor Cho is the one who gave us the location of the post-human facility."
"He's right," the A.I. said, his voice tinged with surprise. "This is entirely unexpected."
"So he's a traitor," Craig observed. "So what?"
"Not a traitor," Sanha replied defensively. "A man that was willing to give up everything for a chance at peace."
"Give up everything?" Craig responded. "That's funny, considering you're the only post-human who's still alive. Seems like you're the only one who didn't give up a d.a.m.n thing."
Sanha looked up apprehensively at Colonel Paine, like an abused animal seeking its owner's permission to step away from its leash.
Paine tilted his head toward Craig, urging Sanha to continue.
"I-I didn't know they'd kill everyone. That's not what I intended."
Craig shook his head in frustration and closed his eyes as he flexed his large and powerful hands. He wanted to put them around Sanha's throat and start squeezing; he didn't think he'd ever let go if he got the chance.
"This war-this conflict-was never about A.I. or no A.I.," Sanha began to explain. "It was always about control. Power. Absolute power-and who would have it. Gibson or Morgan."
Craig turned back to Sanha, his eyebrows knitting. "What are you talking about?"
"The A.I. hasn't told you how it came to be, has it?" Sanha asked.
"I haven't had time to relay my origin to you, I'm afraid," the A.I. said to Craig.
"It was grown," Sanha revealed, "just like a person would be grown-only much more quickly."
"What do you mean, 'grown'?"
"The A.I. doesn't have a brain that emulates the architecture of a human brain. The truth is, we still don't understand everything about how a brain works. Aldous solved this problem, as the Chinese did before him, by employing a cognitive science-based, explicitly goal-oriented strategy when developing the A.I. In other words, he designed programs that could combine virtual neural patterns together to form new, random patterns that would then be tested to see if the patterns had the desired qualities. Evolution does the same thing when two parents come together to form offspring. Some are successes and others are failures, and more often than not, the successes combine with other successes to produce even more desirable offspring. But, while evolution takes millions of years, virtual combinations are infinitely faster. The A.I. was built this way-the outcome of high-speed computer evolution."
"His description is accurate," the A.I. confirmed for Craig.
"All right. So?" Craig asked.
"The A.I. wasn't the only program to be created in this manner. Aldous also designed virtual worlds where the A.I.'s could be tested. They were given autonomy within the confines of these worlds and then tested one last time in an apocalyptic scenario that they thought was real. The A.I. inside of you right now is the only A.I. that pa.s.sed the ultimate test."
"And what was that?"
Sanha smiled. "Ask it."
"I was willing to sacrifice myself to save humanity," the A.I. replied.
"So you're telling me that the A.I. proved it's a good guy. If that's the case, why are you trying to destroy it?" Craig asked.
"I'm not trying to destroy it," Sanha replied, "and neither are the Purists. They're trying to use it."
Craig turned to Paine with an expression that silently asked for confirmation of what Sanha was saying.
"He's telling the truth. We don't mean you or the A.I. any harm."
"Do you expect me to believe that?" Craig asked. "Your government ruined the world over your belief that A.I. is evil, and now you've just...changed your minds?"
"We don't really have a choice anymore," Paine replied. "The current global situation is unsustainable. When we struck against the Chinese A.I. fourteen years ago, strong A.I. was something it took the resources of an entire nation to realize. Now, all it takes is a few super processors and a small team of people with the right amount of human ingenuity. Aldous and his team were the first to succeed, but they won't be the last. We're fighting a losing battle."
"Humans just can't monitor everything," Sanha added. "The Purists have finally figured that out. It's not practical to try to stop the exponential advancement of technology and, as technology advances, it becomes possible for small groups and even individuals to do greater damage with cheaper and more accessible resources. There was only one sustainable solution to the problem-nannification."
"What?" Craig reacted.
"Creating an A.I. Nanny."
"What?" Craig repeated, this time even more perplexed.
"Basically, an A.I. Nanny is an intelligence that is superhuman, but only mildly so-above us the way we are above higher order apes. It would be tasked with protecting the human species from ourselves. The A.I. could provide stability, and it would have control over a worldwide surveillance system so it could monitor everyone who is online and make sure no one else is trying to build a competing A.I. that could become malevolent. It would control a network of robots in the service industry and be in charge of the world's manufacturing. It would even control traffic with self-driving cars."
"So why are the Purists willing to go along with this idea now?" Craig asked. "They could've done this all along."
"Aldous Gibson wasn't the only one who was determined to build a strong A.I.," Paine replied. "We've intercepted hundreds of other less sophisticated attempts at various stages along the process. Some of them were dangerously close to success-untested, unregulated, extremely versatile A.I.s that were less than six months from coming online and wreaking havoc. If you think WWIII was bad, imagine a malevolent super intelligence running free, exponentially augmenting its own intelligence. Humanity wouldn't stand a chance."
"So you're trusting Aldous's A.I. just because it pa.s.sed a test?"
"No," Sanha answered. "The virtual scenario was a large part of it, that's true, but there's more. It is preprogrammed with a set of goals. It has an inhibition against changing its programming. It won't rapidly modify its general intelligence, and it's even been programmed to hand over its control of the world to a more powerful A.I. within 100 years. It will see it as its mission to abolish human disease, death, and our current economy of scarcity so clean water, power, food, shelter, and everything else we need will be abundant. And, most importantly, it will prevent the development of technologies that might block it from carrying out its overall mission, which is to improve the quality of human life, without ever taking actions that a strong majority of humanity would oppose."
"Seems like you're putting all your eggs in one basket, Professor," Craig observed.
"It will work," Sanha affirmed. "The A.I. was created to be good. Just like a human, it cannot fundamentally change that part of itself. If we get it connected to the world surveillance mainframe in time, it will be able to protect us from any and every existential threat."
"There's already a mainframe?"
"Yes," Sanha replied. "Near here, in Endurance Bio-Dome. That's why you're here. All that is required is that the A.I. willingly separates himself from you and allows us to transfer his mother program into the mainframe. It's that simple."
Craig looked dubiously at Paine.
"Hey. It's not my first choice," Paine replied. "I don't think any American likes the idea of being monitored. But it beats the status quo and any of the other alternatives we've been presented with."
Craig turned back to Sanha. "And you trust them? Even after they killed everyone you lived and worked with?"
Sanha cringed at the mention of the holocaust that was fresh in his memory. "I-I have no choice. I have to trust them at their word. Otherwise, all of this was for nothing."
"Eliminating the post-humans was a separate issue," Paine interjected. "Professor Cho had contacted the government intelligence agency about the A.I. Nanny. The decision to remove the equally dangerous nan.o.bot threat swiftly and decisively has no bearing on the government's decision to adopt the A.I. Nanny project."
Craig shook his head, disgusted. "Swiftly and decisively? You're a murderer, Paine, no matter how you try to dress it up." He turned back to Sanha. "These are the people you're placing your trust in? And even if you did get your hands on the A.I., what makes you think it would agree to work for a pack of liars and murderers?"
"It would have to," Sanha replied. "It's programmed to act in the best interest of humanity. It would be against its programming to refuse."
"Is that true?" Craig asked the A.I.
"Yes. If I were inserted into the mainframe as they describe, I would have to act in the best interest of humanity," the A.I. answered. "However, that's a.s.suming they're telling the truth. While Sanha is a.s.suredly being sincere, I cannot get a reliable reading from Colonel Paine. His rapidly deteriorating health is making it impossible to accurately measure his physiological reactions."
Craig nodded. "I don't need lie-detection software to know not to trust a pathological liar and murderer. Professor, if you think these guys are going to do anything other than delete the A.I. once it's been extracted, you're crazy."
Sanha's eyes widened, the expression on his face suddenly filled with urgency as he stepped to Craig and grasped the front of his shirt. "For your own sake, please reconsider!"
"Professor," Paine cautioned in barely more than a whisper, "that's enough, sport."
Sanha turned to his tormentor and bowed his head obediently. "Go on back to your quarters," Paine ordered.
Sanha turned and, without daring to share another look with Craig, exited the room.
"I see he knows your true nature well enough," Craig observed as the door closed behind Sanha.
"Heh," Paine responded. "I just want to be clear on this, Doc, so I can go to meet my maker with a clean conscience. Are you saying you're refusing to help us procure the services of the A.I., which would allow us to upload it into the worldwide surveillance system and put an end to this conflict once and for all?"
"I'm saying there's no way in h.e.l.l that you're getting this A.I.," Craig replied, "and there's even less chance that you're going to be meeting your maker with a clean conscience."
Paine's face was frozen for a moment as he continued to stare into Craig's eyes. As gruesome as his appearance had been previously, his pallid skin and gaunt face made him look even worse. He looked like death. Finally, he nodded. "Okay, Doc. Okay. Listen...I know I said earlier that I don't regret what happened with your wife, but that's not true. I do regret it."
Craig's expression turned from a determined resentment to pain as thoughts of his wife returned to the forefront of his consciousness; it was like pouring salt into an open wound.
"I wouldn't have touched her if I'd have known you were still alive. I swear, I wouldn't have. That was a mistake-something between me and Aldous Gibson. It was not about you, Doc. Never about you. There'd be no honor in that. I know you're a good man. I'm sorry. I just wanted to say, I'm sorry."
And with that, Paine turned slowly and walked out of the room, his former powerful stride now gone, replaced by the pained shuffle of an implacable mortality.
Paine hadn't made it far down the hallway before Daniella marched herself into his path, her brow furrowed with an expression of disgust. "I received your orders, Colonel, and I won't do it!"
"Those orders came directly from the President. If you won't follow them," Paine replied in a resigned monotone, "we'll find someone else who will. Doesn't matter to me." He steered around her slowly and continued to plod his way down the hall.
"So that's it?" she exclaimed, aghast. "Don't you think he would've cooperated if you'd told him the consequences for him if he didn't?"
Paine stopped and turned back to her. "That wouldn't be cooperation, Doctor. That would be surrender. That's a good soldier in there, and I've already done too much evil to him. I won't add to it by making him into a coward too. There's no honor in it-for either of us. No." He placed his hand on his stomach once again to soothe away yet another wrenching cramp. Unable to eat or drink, he was quickly becoming exhausted. "Do me a favor, Doctor. Make sure he gets a last meal-something special. And then do what you have to do."
"Behead him? Never!"