"If I'm the boss," Thomas said, "you should answer to me."
"You aren't my boss."
"Then who is?"
"Charon."
He felt as if he were caught in a programming loop that kept going around and around the same section of code. "Charon is dead."
She hesitated just a moment, but for an AI it was a long time. Then she said, "You may be a compelling specimen, General, but I wasn't made for you."
Well, hell. Apparently androids could be just as blunt as young people these days when it came to their private lives. He cleared his throat. "I didn't have that in mind." He almost said he had come to debrief her, then decided that wasn't the best choice of words. So instead he added, "I need you to answer some questions."
Her expression turned stony. The effect was almost convincing, but after her total lack of affect a moment before, he didn't believe it. Unexpectedly, though, she didn't refuse to speak.
"What questions?" she asked.
"Charon has a base in Tibet."
She gave him a decidedly unimpressed stare. "No. One of his corporations has a research facility in Tibet."
Thomas met her skeptical look with one of his own. "Hidden at the top of the Himalayas? I don't think so."
She stepped toward him. "Charon is a genius. Of course people struggle to understand him. They lack his intelligence."
"Did he program you to say that about him?"
"Yes."
That figured. Charon had been some piece of work. "He had great gifts," Thomas acknowledged. "But his sickness constrained him."
She folded her arms as if she were protecting herself. "People always call the brilliant minds unbalanced."
Thomas wondered if she had heard all this from Charon. Her ideas sounded oddly dated. "Alpha, that's a myth. Geniuses are no more likely to be mentally disturbed than anyone else. Charon was a sociopath and he had paranoid schizophrenia. It probably limited his work by making it harder for him to plan or to judge the feasibility of his projects."
Her lips curved in a deadly smile. "He created me. If that isn't genius, nothing is."
When she looked like that, wild and fierce, her dark hair disarrayed, her eyes burning and untamed, he was tempted to agree. He suppressed the thought, thrown off balance. He had to remember she was a machine.
"Did he program you to say that, too?" Thomas asked.
"No."
He smiled slightly. "What makes you a work of genius?"
Her voice turned husky. "Maybe someday I'll let you find out."
He thought of pretending she had no effect on him, but he didn't try. She could interpret emotional cues, gestures, even changes in posture. It was a tool AIs used in learning to simulate emotions. Unfortunately, it also made them adept at reading people, better even than many humans. If he put on a front, she might figure out what he wanted to hide and use that knowledge in their battle of words.
Right now, they were battling with silence. He tried to read her expressions. Sometimes she simulated emotions well, but other times, she either couldn't or wouldn't. To be considered sentient, she would have to pass modern forms of the Turing test, which included the portrayal of emotions. Over the years, the tests had become increasingly demanding, but they all boiled down to one idea: if a person communicated with a hidden machine and a hidden human, and couldn't tell them apart, the machine had intelligence.
Decades ago, people had expected that if a computer bested a human chess master, the machine would qualify as intelligent. Yet when the computer Deep Blue beat Garry Kasparov, the world champion, few people considered it truly intelligent; it simply had, for the time, good enough computational ability.
Nowadays mesh systems routinely trounced champions, to the point where human masters were seeking neural implants to provide extra computational power for their own brains. Thomas couldn't imagine what that would do to the game at a competitive level. What defined machine intelligence then?
Older Turing tests had relied on sentences typed at terminals, with the typists hidden. The most modern test, the visual Turing, required an android to be indistinguishable from a person. Some experts believed human brains were wired to process more emotional input than an EI matrix could handle. They considered the visual test impossible to pass. Although Thomas didn't agree, it didn't surprise him that only a handful of machine intelligences existed. Alpha passed the visual Turing only if her interactions involved tangible subjects. When pushed to more complex questions of emotion, philosophy, or conscience, she shut down.
While Thomas was thinking, Alpha studied him. After a while, she stalked over, sleek and deadly in her black leather. The orderlies stepped closer, but he waved them off, keeping his gaze on Alpha. She halted by the couch, on the other side of the coffee table, as tense as a wildcat ready to attack.
"You can't control me," she said. Her voice made him think of aged whiskey.
"But you have no free will," Thomas said. "And Charon is dead."
"I have orders."
It was the first time she had revealed she might be operating according to a preset plan. "From Charon?"
"That's right." She had gone deadpan again. Every time Charon came up, she ceased showing emotion.
Why? In a person, he might have suspected some sort of trauma a.s.sociated with Charon, but with Alpha he couldn't say. Although she presented an invulnerable front, something about her made him question that impression. It wasn't anything he could pin down, just a gut-level instinct on his part.
"What orders did he give?" Thomas asked.
"Return to him." She sat on the couch, poised on the edge like a wild animal ready to bolt. "If I can't, then protect myself."
"How? And against what?"
"Do you really think I would tell you?"
"With you, I never know," Thomas admitted. She had already said more to him today than she had to
everyone else combined.Unexpectedly, she said, "I like it that way.""Can you like something?" His scientific curiosity jumped in. "Most people think an AI doesn't truly feel emotion."
"Here's an emotion for you." Alpha looked around at the guards and her room. "I don't like being cooped up here."
"Where would you like to be?" Maybe she would bargain.
"Outside."
"Why? Aren't you just simulating unease?"
She smiled with an edge. "You think you're clever, implying my request is illogical. You humans love stories about people outwitting machines by virtue of your purportedly greater creativity, blah, blah, blah. But you see, we read all your books. You couldn't come close to mastering the breadth of human knowledge if you worked on it your entire life, but it takes me only weeks to absorb, process, and analyze the contents of an entire library. I know all the scenarios and supposed solutions humankind has come up with in your ongoing paranoia about the intelligences you've created. You try to outthink us, but ultimately you fail."
Thomas leaned forward. "Yet you miss the most obvious flaw of your analysis."
She raised an eyebrow. "Do tell."
"We are becoming you." He watched her closely. "Do you really believe humanity would settle for
being second-cla.s.s citizens to our own creations? We will incorporate your advantages within ourselves
while retaining that which makes us human."
She waved her hand in dismissal. "It's all semantics. Whether you choose to call yourselves formas or human won't alter the facts. Biomech changes you, whether you put it in a robot or your own brain." Her eyes glinted. "Who knows, perhaps it will overwrite what 'makes you human.' Corrupt your oh-so-corruptible selves."
Thomas gave a rueful grimace. "Maybe it will."
She seemed satisfied with his response. "You want to bargain with me. Fine. Take me for a ride outside and I'll tell you what orders Charon left me."
"You know I can't do that. You might escape."
"True. Do it anyway."
Thomas had to give her points for audacity. "Why?"
Her expression went completely flat. "Because you want to know what Charon ordered me to do."
Thomas wondered if she knew the unsettling effect it had on him when she turned off her emotional