But just as for many applications the performance bottleneck isn’t CPU speed, for most people the success bottleneck isn’t rationality.
Instrumental rationality, among other things, points people to whichever of their skills or abilities is currently the performance bottleneck and encourages them to work on that, not the thing that’s most fun to work on. So we would still expect instrumental rationalists to win in this model.
(Yes, epistemic rationality might not lead to winning as directly.)
Yes, epistemic rationality might not lead to winning as directly
Why would that be? Is it that many people work in areas where it doesn’t really matter if they are mistaken? Or do people already know enough about the area they work in, so further improvements have diminishing returns? Epistemic rationality provides a direction for people’s efforts if they want to become less wrong about stuff. Are people simply unwilling to put in that effort?
Is it that many people work in areas where it doesn’t really matter if they are mistaken? Or do people already know enough about the area they work in, so further improvements have diminishing returns?
More the latter. Most of the things that a person could learn about are things that won’t help them directly. Agreed that if one has poor epistemic rationality, it’s hard to do the instrumental rationality part correctly (“I know, I’ll fix this problem by wishing!”).
People may underestimate the amount and kind of information they need to turn epistemic rationality into instrumental rationality.
People may underestimate the value of clearly stated and communicated preferences.