My weak definition of rationality: thinking about your own knowledge and beliefs from an outside perspective and updating or changing them if they are not helpful and don’t make sense (epistemic); noticing your goals, thinking a bit about how to achieve them, then deliberately acting on that plan to see if it works, while paying attention to whether it’s working so you can try something else if it isn’t; and thinking about, and trying to notice, the actual consequences of your actions (instrumental).
Short: epistemic=believing things on purpose, instrumental=doing things on purpose for thought-out reasons.
I say weak because this isn’t a superpower; you can do it without being amazingly good at it (e.g. if you have an IQ of 90). But you can exercise without being amazingly good at any sport, and you still benefit from it. I think the same holds for basic rationality.
If a person’s behavior is seemingly irrational to you, there’s still some reason, maybe understood, maybe not understood, maybe misunderstood, why the person behaves that way.
In a general sense, yeah. People operate inside causality. But people do things for reasons they haven’t noticed, haven’t thought about, and might not agree with if they did think about them. For example, Bob might find himself well on the path to alcoholism without realizing that his original, harmless-and-normal-seeming craving for a drink in the evening happened because it helped with his insomnia, a problem that could more healthily be addressed by booking a doctor’s appointment. (I pick this example because I recently caught myself in the early stages of this process.) But from the inside, it doesn’t feel like the brain is fallible, and so even people who’ve come across research to the contrary feel like their introspection is always correct, let alone people who’ve never seen those ideas. I don’t think the IQ threshold for understanding and benefiting from “I might be wrong about why I do this” is very high.
thinking about your own knowledge and beliefs from an outside perspective
Interesting. I’d probably call this self-reflection. I am also wary of the “if they are not helpful and don’t make sense” criterion: it seems to depend way too much on the way a person is primed (aka strong priors). For example, if I am a strongly believing Christian, live in a Christian community, have personal experiences of sensing the godhead, etc., any attempt to explain atheism to me will be met with “not helpful and doesn’t make sense”. And “believing things on purpose” runs into the same problem: the same person purposefully believes in Lord Jesus.
Epistemic rationality should depend on comparison to reality, not to what makes sense to me at the moment.
For instrumental rationality, here are some things possibly missing: Cost-benefit analysis. Forecasting consequences of actions. Planning (in particular, long-term planning).
But I don’t know that you can’t find all that on the self-help shelf at B&N...
Short: epistemic=believing things on purpose, instrumental=doing things on purpose for thought-out reasons.
It’s worth noting that this is different from how CFAR and the Sequences tend to think about rationality. They would say that someone whose beliefs are relatively unreflective and unexamined but more reliably true is more epistemically rational than someone with less reliably true beliefs who has examined and evaluated those beliefs much more carefully. I believe they’d also say that someone who acts with less deliberation and has fewer explicit reasons, but reliably gets better results, is more rational than a more reflective but ineffective individual.
Agreed. And that makes sense as a way to compare a number of individuals at a single point in time. However, if you are starting at rationality level x, and you want to progress to rationality level y over time z, I’m not sure of a better way to do it than to think deliberately about your beliefs and actions. (This may include ‘copying people who appear to do better in life’; that still constitutes ‘thinking about your beliefs/goals’.) Although there may well be better ways.
Right. I’m making a point about the definition of ‘rationality’, not about the best way to become rational, which might very well be heavily reflective and intellectualized. The distinction is important because the things we intuitively associate with ‘rationality’ (e.g., explicit reasoning) might empirically turn out not to always be useful, whereas (instrumental) rationality itself is, stipulatively, maximally useful. We want to insulate ourselves against regret of rationality.
If having accurate beliefs about yourself reliably makes you lose, then those beliefs are (instrumentally) irrational to hold. If deliberating over what to do reliably makes you lose, then such deliberation is (instrumentally) irrational. If reflecting on your preferences and coming to understand your goals better reliably makes you lose, then such practices are (instrumentally) irrational.
Agreed that it’s a good distinction to make.