Why does general “rationality” skill prevent these things, rather than (or better than) situation-specific knowledge? Yes, if you were a good rationalist and could apply rationality to all parts of your life with ease, you’d dodge the problems listed. But does it follow that developing rationalist skills is a good use of your time per unit of disutility avoided?
Rationality, I think, has to develop in you pretty well before it becomes the source of your resistance to these problems.
So I’m not sure if these are costs of irrationality per se, but rather, of lacking both well-developed rationality, and a specific insight.
Not sure if this is what you were asking, but...

I think it’s pretty obvious from the examples that irrationality in general is causing massive amounts of damage in society, all the time. Even mild improvements in people’s average rationality would probably pay for themselves many times over.
The question of whether it’d be more efficient to just teach people situation-specific knowledge versus general rationality skills is an open one. Certainly teaching even just the situation-specific knowledge would probably be worthwhile, and it’s likely that it would be easier. On the other hand, if you only teach situation-specific knowledge, then people might not be able to apply it in related but not identical situations, and it will only protect them against the problems you know exist. General rationality skills would help even against problems you don’t know exist.
In the sense that a certain amount of improvement in rationality would prevent these things, yes, you’re right. But I disagree with this promotion of it as “the” cause of these failures, because you have to be pretty advanced in your level of rationality and willingness to apply it (e.g. against social pressure) before your rationality “automatically” causes you to save yourself from these negative events.
I don’t know. Rationality certainly doesn’t automatically protect you against any of these, but I suspect that even something very basic, like being in the habit of actually thinking about things in a more critical light, would already have given many people a much better chance of avoiding most of these. Stanovich:
Several studies have shown that practice at the simple strategy of triggering the thought “think of the opposite” can help to prevent a host of the thinking errors studied in the heuristics and biases literature, including but not limited to: anchoring biases, overconfidence effects, hindsight bias, confirmation bias, and self-serving biases.
Several of those were probably also involved in many of the examples listed.
(The techniques for promoting rationality that are discussed in the book will be the topic of a separate post.)
Rationality certainly doesn’t automatically protect you against any of these
I meant “automatic” in the sense that you don’t feel you are doing anything different with your ritual of cognition when you make the choice (that avoids whatever calamity is in the example).
Several studies have shown that practice at the simple strategy of triggering the thought “think of the opposite”
That seems like a benefit of a specific heuristic, not “rationality” per se. I agree that simple heuristics can be very powerful, but not because they’re instances of someone’s general improvement in the “rationality” skill.
Rationality is when you can understand why the “think of the opposite” heuristic works and can come up with such effective strategies on your own.
ETA: A better way to put what I’m saying: intelligence is in the being who writes the program or builds the computer, not the computer that executes it. It’s in the one who comes up with the simple but effective rule, not the one who’s capable of implementing it.
I think we have different definitions for rationality. For me, teaching beginning-level rationality is just teaching people to recognize various biases, teaching them useful heuristics, and so forth. Coming up with your own heuristics is a more advanced skill, but a part of the same discipline nonetheless.
If you’re teaching people to program, you start by teaching them a specific programming language and how to do some basic stuff in that one. If you’re teaching people math, you start by some specific subarea of math and practice problems. If you’re teaching a foreign language, you start with some basic rules of grammar and vocabulary. And so on. It’s very rare that you’d be able to directly teach “the general X skill”, regardless of what X was. Instead, you teach relatively specific stuff, and gradually they’ll learn to think in the way required by the skill.
I don’t disagree with that. What I’m saying, rather, is that you shouldn’t try to persuade someone to take computer science 101 on the grounds that “hey, programmers make a lot of money!” or “they hand out good candy in CompSci 101!”, both of which you seem to be doing here. (The latter metaphor refers to you describing benefits that can be had without having to learn rationality per se.)
I’m not sure what I originally intended, because I didn’t really think in those terms while writing the post. Afterwards, though, I’d say that the most reasonable way to use the post to argue would be: “these are the kinds of problems you’ll encounter in your life; if you take a CS101 class, you’ll learn to deal with some of them and it’ll be at least somewhat useful; and if you want to, you can take more classes and learn how to deal with even more problems of this kind.”
Up vote, because you’re right, but Kaj’s intention here was to provide real-world examples for convincing the layman of rationality’s value. A similar example for the world of politics:
“I don’t care about politics.”
“But you smoke weed, right? That’s a political issue—politics boils down to whether or not you have to worry about the cops knocking on your door and busting you for non-criminal behaviour.”
So while his examples aren’t perfect from our perspective, what he’s really aiming for is situations which can be exploited through the dark arts to convince people to be rational. If I can come up with any better ones, I’ll post them later today.
Well, then it seems like exaggeration to the point of lying. It’s more like if you said, “you should exercise so you can look like Arnold Schwarzenegger / Megan Fox and get roles in big Hollywood movies!”
Is it possible? Sure, if you put an insane level of effort into it and are in just the right circumstances, but it’s not an accurate picture of the benefits to present to someone.
I think instead you should teach a specific rationalist skill (like making knowledge truly part of yourself) and justify it by the more reachable benefits.