Bo, the point is that what’s most difficult in these cases isn’t the thing that the 10-year-old can do intuitively (namely, evaluating whether a belief is credible, in the absence of strong prejudices about it) but something quite different: noticing the warning signs of those strong prejudices and then getting rid of them or getting past them. 10-year-olds aren’t especially good at that. Most 10-year-olds who believe silly things turn into 11-year-olds who believe the same silly things.
Eliezer talks about allocating “some uninterrupted hours”, but for me a proper Crisis of Faith takes longer than that, by orders of magnitude. If I’ve got some idea deeply embedded in my psyche but am now seriously doubting it (or at least considering the possibility of seriously doubting it), then either it’s right after all (in which case I shouldn’t change my mind in a hurry) or I’ve demonstrated my ability to be very badly wrong about it despite thinking about it a lot. In either case, I need to be very thorough about rethinking it, both because that way I may be less likely to get it wrong and because that way I’m less likely to spend the rest of my life worrying that I missed something important.
Yes, of course, a perfect reasoner would be able to sit down and go through all the key points quickly and methodically, and wouldn’t take months to do it. (Unless there were a big pile of empirical evidence that needed gathering.) But if you find yourself needing a Crisis of Faith, then ipso facto you aren’t a perfect reasoner on the topic in question.
Wherefore, I at least don’t have the time to stage a Crisis of Faith about every deeply held belief that shows signs of meriting one.
I think there would be value in some OB posts about resource allocation: deciding which biases to attack first, how much effort to put into updating which beliefs, how to prioritize evidence-gathering versus theorizing, and so on. (We can’t Make An Extraordinary Effort every single time.) It’s a very important aspect of practical rationality.
To those who are saying things like “Eliezer, someone will get power anyway and they’ll probably be worse than you, so why not grab power for yourself?” — and assuming, for the sake of argument, that we’re talking about some quantity of power Eliezer is actually in a position to grab: if you grab power and it corrupts you, that’s bad not only for everyone else but also for you and whatever your goals were before you got corrupted. Observing that other people would be corrupted just as badly defuses the first of those objections to power-grabbing, but not the second.