Eliezer: Pretty sure that if I ever fail to give an honest answer to an absurd hypothetical question I immediately lose all my magic powers.
I just cannot picture an intelligent cognitive process that is actually trying to convince people of AI risk, as opposed to just trying to try, landing in the mental state corresponding to Eliezer’s stance on hypotheticals (and yes, I know “trying to try” is a joke, but it’s not that far from the truth).
I think the sequences did something incredibly valuable in cataloguing all of these mistakes and biases that we should be avoiding, and it’s kinda gut-wrenching to watch Eliezer now going down the list and ticking them all off.
I think Eliezer realizes internally that most of his success so far has been due to his unusual, often seemingly self-destructive honesty, and that it’d be a fraught thing to give that up now “because stakes”.