Yeah, that’s a good question. I haven’t given any advice as to how to set up a mental compartment, and I doubt it’s the sort of advice you’re going to find around these parts :-)
Setting up a mental compartment is easier than it looks.
First, pick the idea that you want to “believe” in the compartment.
Second, look for justifications for the idea and evidence for the idea. This should be easy, because your brain is very good at justifying things. It doesn’t matter if the evidence is weak, just pour it in there: don’t treat it as weak probabilistic evidence, treat it as “tiny facts”.
It’s very important that, during this process, you ignore all counter-evidence. Pick and choose what you listen to. If you’ve been a rationalist for a while, this may sound difficult, but it’s actually easy. Your brain is very good at reading counter-evidence and dismissing it out of hand if it doesn’t agree with what you “know”. Fuel that confirmation bias.
Proceed to regulate information intake into the compartment. If you’re trying to build up “Nothing is Beyond My Grasp”, then every time that you succeed at something, feed that pride and success into the compartment. Every time you fail, though, simply remind yourself that you knew it was a compartment, and this isn’t too surprising, and don’t let the compartment update.
Before long, you’ll have this discontinuous belief that’s completely out of sync with reality.
That sounds plausible, but it’s not clear to me how to do this: how do you keep the compartment from updating once you’ve seen the counter-evidence?
Hmm. This is the machinery of compartmentalization at play—you can get evidence against belief X and update on it while still maintaining belief X in full force. I’m not sure I can articulate how it’s done. It’s just… the beliefs live in separate magisteria, you see.
This is hard to explain in part because preventing the compartment from updating is not an act of will, it’s an act of omission. You build this wall between your belief systems, and then updates no longer propagate across the wall unless you force them to. To prevent compartment updates, you simply don’t force the compartment to update.
This may sound difficult, but I find that in practice it’s so easy it’s scary. This may differ from person to person, and I wouldn’t be surprised if many people find my methods difficult.
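If it helps to make the asymmetry concrete, here’s a minimal toy sketch in Python (my own caricature, assuming you’re willing to pretend beliefs are log-odds accumulators, which real cognition obviously isn’t): the same proposition gets tracked twice, and the compartmentalized copy simply drops any evidence that points the wrong way at the wall.

```python
import math

def update(log_odds, likelihood_ratio):
    """Standard Bayesian update in log-odds form."""
    return log_odds + math.log(likelihood_ratio)

def gated_update(log_odds, likelihood_ratio):
    """Compartment update: evidence against the belief is dropped at the wall."""
    if likelihood_ratio >= 1.0:      # confirming evidence gets through
        return update(log_odds, likelihood_ratio)
    return log_odds                  # disconfirming evidence is ignored

def prob(log_odds):
    """Convert log-odds back to a probability."""
    return 1 / (1 + math.exp(-log_odds))

# A stream of evidence about "Nothing is Beyond My Grasp":
# likelihood ratios > 1 are successes, < 1 are failures.
evidence = [2.0, 0.25, 3.0, 0.1, 1.5, 0.2]

global_belief = 0.0   # log-odds of the proposition, outside the compartment
compartment = 0.0     # log-odds of the same proposition, inside the compartment

for lr in evidence:
    global_belief = update(global_belief, lr)
    compartment = gated_update(compartment, lr)

print(f"outside the wall: P = {prob(global_belief):.2f}")  # tracks the evidence
print(f"inside the wall:  P = {prob(compartment):.2f}")    # only ever goes up
```

Run on a mixed stream of successes and failures, the un-walled belief ends up tracking the overall evidence while the compartment’s probability only ever ratchets upward: that’s the discontinuous belief out of sync with reality.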
I wouldn’t be surprised if not everybody could do this at all; assuming they could might be the typical mind fallacy at work. Human minds differ quite a lot. Just think about:
visual vs. metaphorical ‘imagination’
weight on abstract/symbolic vs. concrete/fuzzy reasoning
eidetic memory or not
synaesthesia or not