I’m also missing the ability to estimate. Draw a line on a sheet of paper; put a dot where 75% is. Then check if you got it right. I always get that sort of thing wrong. Arithmetic estimation is even harder. Deciding how to bet in a betting game? Next to impossible.
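The exercise above can be sketched in code. This is a hypothetical illustration (the function name and numbers are mine, not from the thread): score a guess at the 75% mark by its error as a fraction of the line's length.

```python
# Hypothetical sketch of the estimation exercise: guess where 75% of a
# line falls, then check how far off you were.

def estimation_error(line_length, guess_position, target_fraction=0.75):
    """Absolute error of a guess, as a fraction of the line's length."""
    true_position = line_length * target_fraction
    return abs(guess_position - true_position) / line_length

# A 20 cm line: the true 75% mark is at 15 cm, so a guess of 13 cm
# is off by 2 cm, i.e. 10% of the line.
print(estimation_error(20, 13))  # 0.1
```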
Whatever mechanism it is that matches theory to reality, mine doesn’t work very well. Whatever mechanism derives expectations about the world from probability numbers, mine hardly works at all. This is why I actually can double-think: I can see an idea as logical without believing it.
I am under the impression that many of Eliezer Yudkowsky’s early sequence posts were written based on (a) theory and (b) experience with general-artificial-intelligence Internet posters. It’s entirely possible that his deductions are correct only for that weird, WEIRD group.
I wasn’t talking about that aspect (although I think he’s wrong there also) but just about the aspect of not doing a good job at things like estimating or mapping probabilities to reality.
I think it’s really the same thing. Mapping probabilities to reality is sort of the quantitative version of matching degree of belief to amount of evidence.
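One way to make that quantitative framing concrete is a calibration score. This is my own illustrative sketch, not anything from the thread: the Brier score penalizes stated probabilities by their squared distance from what actually happened, so a forecaster whose confidence tracks the evidence beats one who is certain about everything.

```python
# Hypothetical illustration: "mapping probabilities to reality" measured
# with the Brier score (lower is better).

def brier_score(predictions, outcomes):
    """Mean squared error between stated probabilities and 0/1 outcomes."""
    return sum((p - o) ** 2 for p, o in zip(predictions, outcomes)) / len(predictions)

# Calibrated forecaster: confident when right, hesitant when unsure.
calibrated = brier_score([0.9, 0.8, 0.1], [1, 1, 0])      # 0.02
# Overconfident forecaster: same outcomes, misplaced certainty.
overconfident = brier_score([0.99, 0.99, 0.99], [1, 1, 0])  # ~0.327

print(calibrated < overconfident)  # True
```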
Possibly taboo self-delusion? I’m not sure that’s what he means. Self-delusion in this context seems to mean something closer to deliberately modifying your confidence in a way that isn’t based on evidence.
Congratulations. You’re just like most humans.
Well, then why does he say self-delusion is impossible? It’s not only possible, it’s usual.