I am under the impression that much of Eliezer Yudkowsky’s early sequence posts were written based on (a) theory and (b) experience with artificial-general-intelligence Internet posters. It’s entirely possible that this is a correct deduction only on that weird WEIRD group.
I wasn’t talking about that aspect (although I think he’s wrong there also), just about the aspect of not doing a good job at things like estimating, or mapping probabilities to reality.
I think it’s really the same thing. Mapping probabilities to reality is sort of the quantitative version of matching degree of belief to amount of evidence.
Possibly taboo self-delusion? I’m not sure that’s what he means. Self-delusion in this context seems to mean something closer to deliberately modifying your confidence in a way that isn’t based on evidence.
Well, then why does he say self-delusion is impossible? It’s not only possible, it’s usual.