My guess is that a lot of predictions → you’re wrong sometimes → it feels like you’re wrong a lot. Contrasted with not making many predictions → not being actually wrong as much. If so:
1) Percent incorrect is what matters, not number incorrect. I’m sure you know this, but it could be difficult to get your System 1 to know it.
2) If making a lot of predictions does happen to lead to a high percentage incorrect, that’s valuable information to you! It tells you that your models of the world are off and thus provides you with an opportunity to improve!
The more time I spend hanging out with rationalists the less comfortable I am making predictions about anything. It’s kind of becoming a real problem?
“Do you think you’ll be hungry later?” “Maybe”
: /
How are you with decisions? Which, after all, are the point of all this rationality.
“How about eating here?” “...?”
Sorry to hear that :(