I said Harry represents an AI. You said Harry is an AI creator. I was just taking you at your word. It didn’t occur to me that you meant Harry represents an AI creator, because that wouldn’t make any more sense. Harry does not represent an AI creator unless he creates something that is in some way analogous to an AI. There is no hint he will do that. Harry is not creating an AI, and is not creating anything analogous to an AI.
Whereas Harry has properties analogous to a recursively self-improving AI on the way to having godlike power. And the danger that Harry will pose once he attains these godlike powers is one of the central themes of the fanfic, and is highlighted throughout the entire fanfic, over and over again.
I believe Eliezer would rather dip his… arm… in a vat of acid than suggest to his readers that an AI will act like a clever, well-intentioned child that learns valuable moral lessons through social interactions with those he loves, over the course of a long, slow, rather inefficient quest for power.
The slowness is due to humans, who represent the AIs, operating only at human speeds, and does not detract from my point.
I still have the impression that you think Eliezer’s view implies that an AI cannot be well-intentioned, or learn valuable moral lessons through social interactions with those he loves. Why? That is exactly as absurd as saying that computers can never be intelligent. (I am ignoring questions of whether a computer program can subjectively experience love, since “love” used as a verb refers to the action rather than to the subjective experience. But, still, anyone who follows Eliezer at all should be a materialist and therefore believe that a computer program can in principle love as well as a person can.)
I said Harry represents an AI. You said Harry is an AI creator. I was just taking you at your word.
Your interpretation is technically grammatically valid but not remotely reasonable.
I still have the impression that you think Eliezer’s view implies that an AI cannot be well-intentioned, or learn valuable moral lessons through social interactions with those he loves. Why? That is exactly as absurd as saying that computers can never be intelligent.
This impression, however, is not even a technically valid interpretation.