I started writing out some notes on my current impressions of the “rationality skill tree”. Then I had a vague sense of having written it before. It turned out to be background thoughts on why doublecrux is hard to learn, which (surprise!) I also thought were key background skills for many other rationality practices.
I haven’t rewritten this yet to be non-double-crux-centric, but think that’d be good to do someday. (The LW team has been chatting about wikis lately, and this feels like something I’d eventually want written up in a way it could be easily collaboratively added to)
Epistemic humility (“I could be the wrong person here”)
Good Faith (“I trust the other person to be believing things that make sense to them, which I’d have ended up believing if I were exposed to the same stimuli, and that they are generally trying to find the truth”)
Confidence in the existence of objective truth
Curiosity / Desire to uncover truth
Building-Block and Meta Skills
(Necessary or at least very helpful to learn everything else)
Ability to notice things (there are many types of things worth noticing; the most obviously relevant are:)
cognitive states
ways-that-ideas-fit-together
physiological states
conversational patterns
felt senses (see focusing).
Ability to introspect and notice your internal states (Focusing)
Ability to induce a mental state or reframe [note: alas, the original post here is gone]
Habit of gaining habits
Notice you are in a failure mode, and step out. Examples:
You are fighting to make sure a side/argument wins
You are fighting to make another side/argument lose (potentially jumping on something that seems allied to something/someone you consider bad/dangerous)
You are incentivized to believe something, or not to notice something, because of social or financial rewards
You’re incentivized not to notice something or think it’s important because it’d be physically inconvenient/annoying
You are offended/angered/defensive/agitated
You’re afraid you’ll lose something important if you lose a belief (possibly ‘bucket errors’)
You’re rounding a person’s statement off to the nearest stereotype instead of trying to actually understand and respond to what they’re saying
You’re arguing about definitions of words instead of ideas
Notice “Freudian slip”-ish things that hint that you’re thinking about something in an unhelpful way. (For example, while writing this, I typed out “your opponent” to refer to the person you’re Double Cruxing with, which is a holdover from treating it like an adversarial debate.)
(The “Step Out” part can be pretty hard and would be a long series of blogposts, but hopefully this at least gets across the ideas to shoot for)
Social Skills (e.g. not feeding into negative spirals, noticing what emotional state or patterns other people are in [*without* accidentally rounding them off to a stereotype])
Ability to tactfully disagree in a way that arouses curiosity rather than defensiveness
Leaving your colleague a line of retreat (i.e. not making them lose face if they change their mind)
Socially reward people who change their mind (in general, frequently, so that your colleague trusts that you’ll do so for them)
Ability to listen (in a way that makes someone feel listened to) so they feel they actually got to talk, which in turn makes them inclined to listen as well
Ability to notice if someone else seems to be in one of the above failure modes (and then, ability to point it out gently)
Cultivate empathy and curiosity about other people so that the other social skills come more naturally, and so that even if you don’t expect them to be right, you can still see value in at least understanding their reasoning (fleshing out your model of how other people might think)
Ability to communicate in (and to listen to) a variety of styles of conversation, “code switching”, learning another person’s jargon or explaining yours without getting frustrated
Habit of asking clarifying questions that help your partner find the Crux of their beliefs.
Actually Thinking About Things
Understanding when and how to apply math, statistics, etc
Practice various creativity related things that help you brainstorm ideas, notice implications of things, etc
Operationalize vague beliefs into concrete predictions
Actually Changing Your Mind
Notice when you are confused or surprised and treat this as a red flag that something about your models is wrong (either you have the wrong model or no model)
Ability to identify what the actual Cruxes of your beliefs are.
Ability to track small bits of evidence as they accumulate. If enough bits of evidence have accumulated that you should at least be taking an idea *seriously* (even if not changing your mind yet), go through the motions of thinking through what the implications WOULD be, to help future updates happen more easily.
If enough evidence has accumulated that you should change your mind about a thing… like, actually do that. See the list of failure modes above that may prevent this. (That said, if you have a vague nagging sense that something isn’t right even if you can’t articulate it, try to focus on that and flesh it out rather than trying to steamroll over it)
Explore Implications: When you change your mind on a thing, don’t just acknowledge it; actually think about what other concepts in your worldview should change. Do this:
because it *should* have other implications, and it’s useful to know what they are
because it’ll help you actually retain the update (instead of letting it slide away when it becomes socially/politically/emotionally/physically inconvenient to believe it, or just forgetting)
If you notice your emotions are not in line with what you now believe the truth to be (on a system-2 level), figure out why that is.
Noticing Disagreement and Confusion, and then putting in the work to resolve it
If you have all the above skills, and your partner does too, and you both trust that this is the case, you can still fail to make progress if you don’t actually follow up and schedule the time to talk through the issues thoroughly. For deep disagreements this can take years. It may or may not be worth it. But if there are longstanding disagreements that continuously cause strife, it may be worthwhile.