“Why ‘undesirable’?” Mike asked.
“Since he is free from sin—either original or actual—he is therefore filled with the plenitude of God’s grace. The purpose of a sacrament is to give grace to the recipient; it follows that it would be useless to give the Sacrament to Snookums. To perform a sacrament or to receive it when one knows that it will be useless is sacrilege. And sacrilege is undesirable.”
“Brother! But I still don’t see how that makes him dangerous.”
“The operation of the First Law,” Leda said. “For a man to sin involves endangering his immortal soul. Snookums, therefore, must prevent men from sinning. But sin includes thought—intention. Snookums is trying to figure that one out now; if he ever does, he’s going to be a thought policeman, and a strict one.”
“You mean he’s working on telepathy?”
She laughed humorlessly. “No. But he’s trying to dope out a system whereby he can tell what a man is going to do a few seconds before he does it—muscular and nervous preparation, that sort of thing. He hasn’t enough data yet, but he will have it soon enough.
“There’s another thing: Snookums is fouling up the Second Law’s operation. He won’t take orders that interfere in any way with his religious beliefs—since that automatically conflicts with the First Law. He, himself, cannot sin. But neither can he do anything which would make him the tool of an intent to sin. He refuses to do anything at all on Sunday, for instance, and he won’t let either Fitz or me do anything that even vaguely resembles menial labor. Slowly, he’s coming to the notion that human beings aren’t human—that only God is human, in relation to the First and Second Laws. There’s nothing we can do with him.”
“What will you do if he becomes completely uncontrollable?”
She sighed. “We’ll have to shut him off, drain his memory banks, and start all over again.”
Mike closed his eyes. “Eighteen billions down the drain just because a robot was taught theology. What price glory?”