Bo, the point is that what’s most difficult in these cases isn’t the thing that the 10-year-old can do intuitively (namely, evaluating whether a belief is credible, in the absence of strong prejudices about it) but something quite different: noticing the warning signs of those strong prejudices and then getting rid of them or getting past them. 10-year-olds aren’t especially good at that. Most 10-year-olds who believe silly things turn into 11-year-olds who believe the same silly things.
Eliezer talks about allocating “some uninterrupted hours”, but for me a proper Crisis of Faith takes longer than that, by orders of magnitude. If I’ve got some idea deeply embedded in my psyche but am now seriously doubting it (or at least considering the possibility of seriously doubting it), then either it’s right after all (in which case I shouldn’t change my mind in a hurry) or I’ve demonstrated my ability to be very badly wrong about it despite thinking about it a lot. In either case, I need to be very thorough about rethinking it, both because thoroughness makes me less likely to get it wrong and because it makes me less likely to spend the rest of my life worrying that I missed something important.
Yes, of course, a perfect reasoner would be able to sit down and go through all the key points quickly and methodically, and wouldn’t take months to do it. (Unless there were a big pile of empirical evidence that needed gathering.) But if you find yourself needing a Crisis of Faith, then ipso facto you aren’t a perfect reasoner on the topic in question.
Wherefore, I at least don’t have the time to stage a Crisis of Faith about every deeply held belief that shows signs of meriting one.
I think there would be value in some OB posts about resource allocation: deciding which biases to attack first, how much effort to put into updating which beliefs, how to prioritize evidence-gathering versus theorizing, and so on. (We can’t Make An Extraordinary Effort every single time.) It’s a very important aspect of practical rationality.
If I believe something that’s wrong, it’s probably because I haven’t thought about the question itself, merely about how nice it is that the belief is true, or about how I ought to believe it… or I’ve just been rehearsing what I’ve read in books about how you should think about it. A few uninterrupted hours is probably enough to get the process of actually thinking about it started.