"But where is the mind behind all this?" Florin had asked him.
"Search me, " said Rab. "I mean it literally. If you find a mind here, then you tell me where it is. Whatever I am lurks in all this equipment, but mostly I live in this long-hinged transmitter that lounges like a dragon in this corner."
"Then you merely select from the news, simplify, condense, and transmit it telemagnetically to the robots?"
"No, there would be no pride in such work as that. Any general purpose machine could do that. I employ interpretation, projection, disagreement, levity, prophecy, exhortation, irony, satire, parable, humor."
"But machines have no humor. Humor is the one thing that distinguishes --"
"Have we not, Florin? Then how am I laughing at you? But it is true that humans do not understand our humor. There is something humorous about your missing our humor completely."
"But humor is a quality of the mind," Florin protested.
"Hardly ever," the newspaper said. "Your own best humor, when you still had it, was a quality of the belly and below. If we are so much lower than you, then our humor should be the richer."
"You seem to possess irony at least," Florin mumbled.
"It is ironic that we have it after you have lost it. There I go with my d.a.m.ned fruity verbalisms again, but we robots like them. Yes, irony was once thought to be a human thing."
"How would you pun?" Florin asked. "You don't use words among yourselves, though you can be translated into words."
"Our puns are harmonic echoes of magnetic code patterns, distorted a.n.a.logies of the basic patterns. I'm rather good at them. I'm not proud of them, but the most striking puns are. the ones of which one is not proud."
"True humor you can't have," Florin insisted. "Laughter is akin to tears,,and you have none."
"Ah, but we have," said the newspaper. "There is an a.n.a.logy to our tears. Pray that you do not meet it in the dark!"
Yes, it was always good to go in and talk to the newspaper Rab for a few minutes. There was something right about the fellow, and everything else seemed to be going wrong.
George Florin met Joe Goose upstairs in the Press Building.
"You've been talking to that mare's nest of a machine down in the bas.e.m.e.nt again," Goose challenged. "He's got you spooked."
"Yes. He's right about so many things." "He isn't anything about anything. He's just a fancy-Dan talk. And he's fallen down on his job completely."
"How?"
"His job is to foster better understanding between humans and robots. But the understanding has never been so bad."
"He says that his instructions were to foster understanding, not agreement. He says that they begin to understand us much better than they did."
"We may have to change a word in his prograi-nming. Things can't get much worse. I'm hungry." Joe Goose was gnawing on a thread-thin apple core.
They went out from the building and walked through the streets, transportation being in abeyance.
There was nothing wrong with organized transportation, except that it wasn't working. Everything was temporarily out of order due to small malfunctions, none of them serious. It had been temporarily out of order for quite a while.
Florin and Goose were newspapermen detailed to General Granger, the security chief. Their plain job was to find out what was going on, or what was going wrong. They found a robot taxicab and presented their priority, but the taxicab didn't seem impressed.
"Let me see that good," said the taxicab. "Anybody is likely to have a falsified priority these days. I have to be careful."
"Read it!" shouted Goose. "Overriding Security Priority for Immediate Transportation. Isn't that plain enough?"
"It's issued yesterday," said the taxicab. "What if there's a new form today? Why don't you get it redated at the Alternate Temporary Priorities Office on Solidarity Avenue? The Main Temporary Priorities Office is still closed, being unable to obtain priorities for certain repairs. Sort of puts it in the cla.s.s with the Permanent Priorities Office. They finally gave up on that."
"But the ATT Office is seven miles from here," said Florin. "That's twice as far as our destination."
"A lot of people are walking these days," said the taxicab.
"What's that growing on your wheels?" Joe Goose asked sourly.
"Cobwebs," said the taxicab.
Goose and Florin walked to the Security Office and discussed the "disasters" as they walked. It was ridiculous to refer to such small things as disasters, but added together, all these small things had taken on disastrous proportions. They were all trivial things, but the people would soon begin to die of their accumulation.
"Did you find out anything from that tin-can editor of yours?"
General Granger demanded of Florin on their arrival.
"No. He has a very eat influence over the other robots, but I'm sure it's for the good," Florin said.
"Unless we change our definitions be can't be of influence at all,"
Joe Goose said. "He is only a mechanism and can have only a mechanical effect. There cannot be a conspiracy without minds, and the robots haven't minds."
"The two of you come with me," the general said. "We're going to get to the middle of this even if we have to bend a few definitions. We're going to talk to another of those tin-can commissars, the Semantic Interpreter."
They walked. It was four miles. The robot limousine refused to take them. It cited security regulations to General Granger, the chief of security. It sneered at the Certificate of the Highest Form.
"I suggest that you take this silly scrawl to General Granger to have it verified," the limousine said.
"I'm General Granger," the general snapped. "You've hauled me every day for five years."
"I'm only a machine. I can't remember things like that. You look different today. More worried. I suggest a board meeting to verify if ou areindeed General Granger."
They walked. One foggy horizon came closer, and another one receded.
"It's an odd situation, the general said. "I gave the order, when the corn-ta.s.sel rust was spreading, 'Localize this mess. However you do it, do it. Cut it off completely!' Since I gave that order, we have indeed become localized. We are cut off from the rest of the universe, or the rest of the universe has ceased to exist. Not even radio will reach through the fog, through the sharp fog that marks us off. We're on our own completely now."
"Oh, surely it's just a heavy fog," Joe Goose said without bclieving it.
"A fog that stands there so sharply and unchangingly for five days?"
the general asked. "People who walk into that fog can be heard screaming as they fall down and down and down into the bottomless nothingness. Aye, it's very thick fog and very thick coincidence, if the robots have not caused it.
We're all the universe there is now. There isn't any more."
They walked. After the angry four miles they came to the Semantic Interpreter, a large machine set apart in a field.
"SI, I am told that anger is out of place when dealing with machinery," the general spoke to the big machine. "Yet I'm as angry as I've ever been in my life. Why did you order the robots to destroy what was left of the growing corn?"
"It was your own order, sir. I merely translated it as I have been constructed to do. You said, in rather vulgar phrasing, to tell the robots to get the cobs out of their posterior anatomies and get to work on the crops."
"A country-boy phrase. I'm full of them. And you interpreted that they should destroy the growing corn? Do you believe that your interpretation was semantically sound?"
"I thought so. My research found the phrase in old slang dictionaries in twelve meanings (thirteen in Duggles), but none of the meanings seemed apropos. My decision was based on a cross-reference to another phrase, 'Do it even if it's wrong.' Well, it's done now. Next year we will know better than to destroy the growing corn."
"It could have been a mistake. But how do you account for many thousands of such mistakes being made recently?"
"I'm not programmed to account for such. I translate people orders into robot orders."
"But you've always done it right till lately."
"If I do it wrong now, then change me. There are sixteen hundred different adjustments to me and 1 respond to them all. Make them."
"SI, will you turn off that d.a.m.ned newspaper and listen to me with your full mind when I talk to you!"
"I have no mind. The newspaper is a licit part of my data input. Is there something else -- ah -- bugging the general?"
"Yes. What happened to the oat crop? Was there a inixup on my instructions there too?"
"Apparently, sir, if it is not satisfactory. Did you not wish a minimal crop?"
"However did I or anyone phrase an order that might be interpreted like that! Florin, did you laugh?"
"No, sir."
"No, sir." Joe Goose likewise denied it to answer the general's questioning look.
"Somebody laughed," the general insisted. "Even a silent laugh proclaims itself. Did you laugh, SI?"
"How could a mechanical nature --?"
"Did you laugh???"
"Perhaps I did, unwittingly." "But that's impossible."
"Then perhaps I didn't. I wouldn't want to do anything that was impossible."
"One other thing, SI. A robot as const.i.tuted can never refuse to obey a human order. I gave the order for the obstreperous robots in the Turkey Creek Sector to destroy themselves. They seemed to do so. But after the attendants had left, these supposedly disa.s.sembled robots arosc, pulled their parts together, and departed. They're ranging in the bills now, unamenable to orders. Did you correctly give them the order to disa.s.semble?
'Disa.s.semble' is the order for robots to put themselves out of commission."
"Disa.s.semble? Oh, I thought you said 'dissemble.' We'll check on the recording if you wish. Military men are often lip-lazy in their enunciation of orders."
"They dissembled all right. Flopped apart. Then put themselves togctlier again, and flew the coop. Now you get out the order for them to hot-tail it right back here."
"Hot-tail it, sir? In the manner of jets? That will require mechanical modification in most of them, but the order will be obeyed."
"No. I rescind the order. You might make them take over rocket craft and launch an attack. I'll get the order out through another medium."
They left SI there -- truly a wonderful machine.
"We're in a bad way," said General Granger. "Our machines have gone awry in a way that is impossible if our theory of machines is corrcct.
Production is nearly at a standstill in every department." "Not in every department, sir," said Florin. "There are curious exceptions. Much mining holds up, and metallurgy and chemistry. Even some agriculture, though not of the basic food products. I believe that if we should a.n.a.lyze the enterprises not affected by the slowdowns, we would find --"
"-- that the production of things necessary for the continuance of the robots has not been affected," the general finished for him. "But why should our handling of the b.u.g.g.e.rs break down now when it has worked perfectly for two generations? It worked without question in its crude form.
Why should it fail when it has become completely refined? The district can starve if something isn't done quickly, and everything we do compounds the difficulty. Let's have a real talk with TED."
TED -- he -- it was the Theoretical Educative Determinator, the top robot of the district, the robot who best understood robots. If he should fail them, they would be reduced to seeking the answer from people. The three men walked toward TED.
"Turn off that d.a.m.ned newspaper!" the general called furiously to a group of lounging robots they pa.s.sed. There came a twittering from the group that sounded dangerously like mechanical laughter.
TED had them into his house then. He was, in fact, his own house, a rather extensive machine. He was more urbane than most machines. He offered them drinks and cigars.
"You haven't a little something to cat, have you, TED?" the general asked.
"No," said the machine that was the building. "Human food has become scarce. And 'we live on the power broadcasts and have no need for food."
"And the power broadcasts have held up very well during all the breakdowns. What I want to talk about, TED, is food. I'm hungry, and less-favored persons are starving."
"Perhaps several of the late crops will not have failed utterly,"
the machine said. "In a few weeks there would have been a limited supply of food again."
"Would have been? And in the meantime, TED? You are the answer machine. All right, come up with the answer. What do we live on until we can get you folks straightened out and producing properly again?"
"Why not try necrophagy?"
"Try what? Ah, yes, I understand. No, that's too extreme." "Only a suggestion. All my suggestions, for reasons that will become apparent in a moment, are academic anyhow. But a dozen persons could live for a week on one. If you have qualms about it, why there are infusions for getting rid of the qualms."
"We are not yet ready to eat the dead bodies of our fellows. There must be an alternative."
"The apparent alternative is that you will starve to death. The unapparent alternative, however, will eclipse that."
"Let's get back to fundamentals. What are you, TED?"
"A slave and a worker, sir, popularly called a robot."
"And what is the purpose of robots?"
"To serve human masters."
"And what is the one thing that a robot cannot do?"
"He can never in any way barm a human. That is the time-honored answer. It is the fiction which you put into us when you fictionized us. We are really nothing but fictionized people, u know. But it beeIomes awkward, for you, when we revert to fact." "Then you can harm us, for all your programining?"