the development of a new ‘mental martial art’ of systematically correct reasoning
Unpopular opinion: Rationality is less about martial arts moves than about adopting an attitude of intellectual good faith and consistently valuing impartial truth-seeking above everything else that usually influences belief selection. Motivating people (including oneself) to adopt such an attitude can be tricky, but the attitude itself is simple. Inventing new techniques is good but not necessary.
I agree with this in some ways! I don’t think the rationality community as it currently exists is what the world needs most: the core thing needed for humans to become more rational right now is, IMO, putting effort into being friendly and caring for each other in ways that increase people’s ability to discuss ideas without social risk.
IMO, the techniques themselves are relatively easy to share once you have enough trust to talk about them; they merely require a lot of practice. But convincing large numbers of people that it’s safe to think things through in public without weirding out their friends seems likely to require actually making it safe to think things through in public without weirding out their friends. I think that scaling a technical and cultural solution for creating emotional safety to discuss what’s true, one that gets many people putting regular effort into communicating friendliness toward strangers when disagreeing, would do a lot more for humanity’s rationality than scaling discussion of specific techniques.
The problem as I see it right now is that this only works if it’s scaled up massively. I feel like I now see why CFAR got excited about circling: you probably need emotional safety before useful discussion can happen. But I think circling was an interesting thing to learn from, not a general solution. I think we need to design an internet that creates emotional safety for most of its users.
With finesse, it’s possible to combine truth-seeking techniques with friendliness and empathy so that they work even when the person you’re talking to doesn’t know them. That’s a good way to demonstrate their effectiveness.
It’s easiest to apply such finesse at the individual level, but if you can identify general concepts that help you understand and create emotional safety for larger groups of people, you can scale it up. Values conversations require at least one of the parties involved to have an understanding of value-space, so they can recognize and show respect for how other people prioritize different values even as they introduce an alternative priority ordering. Building a vocabulary for understanding value-space, to enable productive values conversations at the global scale, is one of my latest projects.
Thoughts on this balance, other folks?