Rationalization is an important skill and should be rewarded, not punished. If you never try to rationalize others’ decisions then you won’t notice when they actually do have a good justification, and if you never practice rationalization then you’ll never get good enough at it to find their justifications when they exist. The result is gross overconfidence in the stupidity of the opposing side and thus gross overconfidence in one’s own rationality. That leads to tragedies and atrocities, both personal and societal.
Perspective-taking is a separate “skill” from rationalizing one’s own behavior.
Hm, is perspective-taking the same skill that I was talking about? I can’t tell. Also I thought that Eliezer’s examples were phrased in the hypothetical, and thus it’d be rationalizing others’ beliefs/behavior, not one’s own. I’m not sure to what extent rationalizing a conclusion and rationalizing one’s own behavior are related. Introspectively, the defensiveness and self-justifying-ness inherent to the latter makes it a rather different animal.
“Coming up with explanations” is a good skill.
“Coming up with a single, stupid explanation, failing to realize it is stupid, and then using it as an excuse to cease all further thought” is a very, very bad skill.
Thinking “well, but abandoning a sunk cost actually IS a negative future event” is smart IFF you then go “I’d be miserable for three days. How does that weigh against years spent in the program?”
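The weighing step can be made concrete with a toy calculation. All the numbers below are made up purely for illustration; the point is only that both future costs get written down and compared, rather than one of them blocking the comparison entirely.

```python
# Toy comparison of the two FUTURE costs in the PhD example.
# The numbers are arbitrary illustrative disutilities, not real estimates.
misery_days = 3
disutility_per_miserable_day = 1.0   # cost of one bad day right after quitting
years_remaining = 2
disutility_per_year_in_program = 50.0  # cost of a year spent somewhere you don't want to be

cost_of_quitting = misery_days * disutility_per_miserable_day        # 3.0
cost_of_staying = years_remaining * disutility_per_year_in_program   # 100.0

# The sunk years already spent appear nowhere above -- only future costs count.
print("quit" if cost_of_quitting < cost_of_staying else "stay")  # prints "quit"
```

Note that the comparison only goes through once you actually assign a weight to the three bad days instead of refusing to think about them.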
It’s very, very bad, however, if you stop there and continue to spend 2 years on a PhD just because you don’t want to even THINK about those three days of misery.
I think understanding this dichotomy is critical. If you stop even thinking “well, but abandoning a sunk cost IS a negative future event” because you’re afraid of falling into the trap of then never abandoning any sunk cost, then you’re ignoring real negative consequences of your decisions.