Shouldn’t the fact that so many smart people believe in God cause EY to give non-trivial weight to the possibility that his brain and those of his fellow atheists have a flaw which blinds them to the truth of religion?
Let’s say a massive number of really smart people have thought a huge amount about proposition X and have concluded that it is true. Regardless of your own evaluation of X, and of how other people evaluate X, doesn’t a rationalist still have to believe that the chance of X being true is non-trivially greater than zero?
I think the fact that the Mind Projection Fallacy is a very strong bias in humans significantly decreases the weight of that possibility. Smart people think X may be true because it sounds like the easiest explanation to a human mind, not because they have actually thought a lot about it from a strictly rational point of view.
That’s a general counter-argument against “trust the majority”, I think. When you learn that the majority has some bias that supports its belief, you should weaken the evidence “the majority thinks X is true”. If A is “the majority believes X” and B is “X is true”, the likelihood ratio P(A|B)/P(A|¬B) ends up close to 1, so A is only weak evidence for B.
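As a worked illustration (with made-up numbers, not anything from the original comment): writing the update in odds form,

\[
\frac{P(B \mid A)}{P(\neg B \mid A)} \;=\; \frac{P(B)}{P(\neg B)} \cdot \frac{P(A \mid B)}{P(A \mid \neg B)},
\]

if the bias makes the majority likely to believe X whether or not it is true, say \(P(A \mid B) = 0.9\) and \(P(A \mid \neg B) = 0.8\), then the likelihood ratio is only \(0.9 / 0.8 = 1.125\), and learning the majority’s belief barely moves your posterior odds.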
Be less ready to disagree with a supermajority than a mere majority; be less ready to disagree outside than inside your expertise; always pay close attention to the object-level arguments; never let the debate become about tribal status.
http://lesswrong.com/lw/jr/how_to_convince_me_that_2_2_3/
http://lesswrong.com/lw/qv/the_rhythm_of_disagreement/