
Chapter 93


R. Daneel Olivaw still spoke quietly, but it seemed to Seldon that there was a subtle change in his voice, as though he spoke more easily now that he was no longer playing a part.

"in twenty thousand years," said daneel, "no one has guessed i was a robot when it was not my intention to have him or her know. in part, that was because human beings abandoned robots so long ago that very few remember that they even existed at one time. and in part, it is because i do have the ability to detect and affect human emotion. the detection offers no trouble, but to affect emotion is difficult for me for reasons having to do with my robotic nature--although i can do it when i wish. i have the ability but must deal with my will not to use it. i try never to interfere except when i have no choice but to do so. and when i do interfere, it is rarely that i do more than strengthen, as little as i can, what is already there. if i can achieve my purposes without doing even so much, i avoid it.

"it was not necessary to tamper with sunmaster fourteen in order to have him accept you--i call it tampering, you notice, because it is not a pleasant thing to do. i did not have to tamper with him because he did owe me for favors rendered and he is an honorable man, despite the peculiarities you found in him. i did interfere the second time, when you had committed sacrilege in his eyes, but it took very little. he was not anxious to hand you over to the imperial authorities, whom he does not like. i merely strengthened the dislike a trifle and he handed you over to my care, accepting the arguments i offered, which otherwise he might have considered specious.

"nor did i tamper with you noticeably. you distrusted the imperials too. most human beings do these days, which is an important factor in the decay and deterioration of the empire. whats more, you were proud of psychohistory as a concept, proud of having thought of it. you would not have minded having it prove to be a practical discipline. that would have further fed your pride." seldon frowned and said, "pardon me, master robot, but i am not aware that i am quite such a monster of pride."

Daneel said mildly, "You are not a monster of pride at all. You are perfectly aware that it is neither admirable nor useful to be driven by pride, so you try to subdue that drive, but you might as well disapprove of having yourself powered by your heartbeat. You cannot help either fact. Though you hide your pride from yourself for the sake of your own peace of mind, you cannot hide it from me. It is there, however carefully you mask it over. And I had but to strengthen it a touch and you were at once willing to take measures to hide from Demerzel, measures that a moment before you would have resisted. And you were eager to work at psychohistory with an intensity that a moment before you would have scorned.

"i saw no necessity to touch anything else and so you have reasoned out your robothood. had i foreseen the possibility of that, i might have stopped it, but my foresight and my abilities are not infinite. nor am i sorry now that i failed, for your arguments are good ones and it is important that you know who i am and that i use what i am to help you.

"emotions, my dear seldon are a powerful engine of human action, far more powerful than human beings themselves realize, and you cannot know how much can be done with the merest touch and how reluctant i am to do it."

Seldon was breathing heavily, trying to see himself as a man driven by pride and not liking it. "Why reluctant?"

"because it would be so easy to overdo. i had to stop rashelle from converting the empire into a feudal anarchy. i might have bent minds quickly and the result might well have been a bloody uprising. men are men--and the wyan generals are almost all men. it does not actually take much to rouse resentment and latent fear of women in any man. it may be a biological matter that i, as a robot, cannot fully understand.

"i had but to strengthen the feeling to produce a breakdown in her plans. if i had done it the merest millimeter too much, i would have lost what i wanted--a bloodless takeover. i wanted nothing more than to have them not resist when my soldiers arrived."

Daneel paused, as though trying to pick his words, then said, "I do not wish to go into the mathematics of my positronic brain. It is more than I can understand, though perhaps not more than you can if you give it enough thought. However, I am governed by the Three Laws of Robotics that are traditionally put into words--or once were, long ago. They are these:

" one. a robot may not injure a human being or, through inaction, allow a human being to come to harm.

" two. a robot must obey the orders given it by human beings, except where such orders would conflict with the first law.

" three. a robot must protect its own existence, as long as such protection does not conflict with the first or second law.

"but i had a ... a friend twenty thousand years ago. another robot. not like myself. he could not be mistaken for a human being, but it was he who had the mental powers and it was through him that i gained mine. "it seemed to him that there should be a still more general rule than any of the three laws. he called it the zeroth law, since zero comes before one. it is:

" zero. a robot may not injure humanity or, through inaction, allow humanity to come to harm.

"then the first law must read:

" one. a robot may not injure a human being or, through inaction, allow a human being to come to harm, except where that would conflict with the zeroth law.

"and the other laws must be similarly modified. do you understand?"

Daneel paused earnestly and Seldon said, "I understand."

Daneel went on. "The trouble is, Hari, that a human being is easy to identify. I can point to one. It is easy to see what will harm a human being and what won't--relatively easy, at least. But what is humanity? To what can we point when we speak of humanity? And how can we define harm to humanity? When will a course of action do more good than harm to humanity as a whole and how can one tell? The robot who first advanced the Zeroth Law died--became permanently inactive--because he was forced into an action that he felt would save humanity, yet which he could not be sure would save humanity. And as he became inactivated, he left the care of the Galaxy to me.

"since then, i have tried. i have interfered as little as possible, relying on human beings themselves to judge what was for the good. they could gamble; i could not. they could miss their goals; i did not dare. they could do harm unwittingly; i would grow inactive if i did. the zeroth law makes no allowance for unwitting harm.

"but at times i am forced to take action. that i am still functioning shows that my actions have been moderate and discreet. however, as the empire began to fail and to decline, i have had to interfere more frequently and for decades now i have had to play the role of demerzel, trying to run the government in such a way as to stave off ruin--and yet i will function, you see.

"when you made your speech to the decennial convention, i realized at once that in psychohistory there was a tool that might make it possible to identify what was good and bad for humanity. with it, the decisions we would make would be less blind. i would even trust to human beings to make those decisions and again reserve myself only for the greatest emergencies. so i arranged quickly to have cleon learn of your speech and call you in. then, when i heard your denial of the worth of psychohistory, i was forced to think of some way to make you try anyway. do you understand, hari?"

More than a little daunted, Seldon said, "I understand, Hummin."

"to you, i must remain hummin on those rare occasions when i will be able to see you. i will give you what information i have if it is something you need and in my persona as demerzel i will protect you as much as i can. as daneel, you must never speak of me."

"i wouldnt want to," said seldon hurriedly. "since i need your help, it would ruin matters to have your plans impeded."

"yes, i know you wouldnt want to." daneel smiled wearily. "after all, you are vain enough to want full credit for psychohistory. you would not want anyone to know--ever--that you needed the help of a robot."

seldon flushed. "i am not--"

"but you are, even if you carefully hide it from yourself. and it is important, for i am strengthening that emotion within you minimally so that you will never be able to speak of me to others. it will not even occur to you that you might do so."

seldon said, "i suspect dors knows--"

"she knows of me. and she too cannot speak of me to others. now that you both know of my nature, you can speak of me to each other freely, but not to anyone else."

Daneel rose. "Hari, I have my work to do now. Before long, you and Dors will be taken back to the Imperial Sector--"

"the boy raych must come with me. i cannot abandon him. and there is a young dahlite named yugo amaryl--"

"i understand. raych will be taken too and you can do with any friend as you will. you will all be taken care of appropriately. and you will work on psychohistory. you will have a staff. you will have the necessary computers and reference material. i will interfere as little as possible and if there is resistance to your views that does not actually reach the point of endangering the mission, then you will have to deal with it yourself."

"wait, hummin," said seldon urgently. "what if, despite all your help and all my endeavors, it turns out that psychohistory cannot be made into a practical device after all? what if i fail?"

daneel rose. "in that case, i have a second plan in hand. one i have been working on a long time on a separate world in a separate way. it too is very difficult and to some ways even more radical than psychohistory. it may fail too, but there is a greater chance of success if two roads are open than if either one alone was.

"take my advice, hari! if the time comes when you are able to set up some device that may act to prevent the worst from happening see if you can think of two devices, so that if one fails, the other will carry on. the empire must be steadied or rebuilt on a new foundation. let there be two such, rather than one, if that is possible."

he rose, "now i must return to my ordinary work and you must turn to yours. you will be taken care of."

With one final nod, he rose and left.

Seldon looked after him and said softly, "First I must speak to Dors."
