[Question] What rationality failure modes are there?
How do people fail to improve their rationality? How do they accidentally harm themselves in the process? I'm thinking of writing a post, "How not to improve your rationality" or "A nuanced guide to reading the Sequences", that preempts common mistakes, and I'd appreciate hearing people's experiences. Some examples:
It took me an absurdly long time (like, 1–2 years in the rationalist community) before I realized you don't correct for cognitive biases; you have to "be introspectively aware of the bias occurring, and remain unmoved by it" (as Eliezer put it in a podcast).
More generally, people can read about a bias and resolve to "do better" without concretely deciding what to do differently. This typically makes things worse: e.g., a friend of mine tried really hard to avoid the typical mind fallacy and accidentally turned off her empathy in the process.
The implicit frame rationalists push is logical and legible, and it can lead people to distrust their emotions. I think it's really important to listen to ick feelings when changing your thought processes, as there can be non-obvious effects.
E.g., my friend started thinking about integrity in terms of FDT (functional decision theory), which disconnected it from their motivational circuits, and they made some pretty big mistakes because of it. If they'd listened to their feeling of "this is a weird way to think", this wouldn't have happened.
(I think many people misinterpret Sequence posts and decide to change their thinking in bad ways; listening to your feelings can be a nice emergency check.)