Short: epistemic = believing things on purpose; instrumental = doing things on purpose, for thought-out reasons.
It’s worth noting that this is different from how CFAR and the Sequences tend to think about rationality. They would say that someone whose beliefs are relatively unreflective and unexamined but more reliably true is more epistemically rational than someone with less reliably true beliefs who has examined and evaluated those beliefs much more carefully. I believe they’d also say that someone who acts with less deliberation and has fewer explicit reasons, but reliably gets better results, is more rational than a more reflective but ineffective individual.
Agreed. And that makes sense as a way to compare a number of individuals at a single point in time. However, if you are starting at rationality level x, and you want to progress to rationality level y over time z, I’m not sure of a better way to do it than to think deliberately about your beliefs and actions. (This may include ‘copying people who appear to do better in life’; that constitutes ‘thinking about your beliefs/goals’.) Although there may well be better ways.
Right. I’m making a point about the definition of ‘rationality’, not about the best way to become rational, which might very well be heavily reflective and intellectualizing. The distinction is important because the things we intuitively associate with ‘rationality’ (e.g., explicit reasoning) might empirically turn out not to always be useful, whereas (instrumental) rationality itself is, stipulatively, maximally useful. We want to insulate ourselves against regrets of rationality.
If having accurate beliefs about yourself reliably makes you lose, then those beliefs are (instrumentally) irrational to hold. If deliberating over what to do reliably makes you lose, then such deliberation is (instrumentally) irrational. If reflecting on your preferences and coming to understand your goals better reliably makes you lose, then such practices are (instrumentally) irrational.
Agreed that it’s a good distinction to make.