In any case, ‘New Atheists’ like Dawkins and Harris are raising the sanity waterline, albeit in a relatively confrontational manner.
(emphasis added)
I don’t think that’s right. If your anecdote is more common than Swimmer963’s, then perhaps. If Swimmer963’s anecdote is more common, and those people would otherwise have found atheism attractive, then they’re doing the opposite.
(Saying obvious things: Protestantism turned me off Christianity for a long time, and Kurzweil turns many people off singularity-related memes. Many are skeptical of ufology because the most common explanation of UFO phenomena is biological extraterrestrials. I heard somewhere that something similar happened with cryonics and nanotechnology. An innocent person is a lot more receptive than someone who has heard the retarded version of an idea. To paraphrase Schopenhauer, it is not weakness of the cognitive faculties that leads people astray; it is preconception, prejudice.)
How do we know that the situation with various crackpot ideas is any different? We don’t actually go and spend weeks seeking out and dissecting the most sane version of every conspiracy we’ve caught wind of. How can we be so certain that if we did that we wouldn’t find some non-obvious truths?
I’d modus tollens your modus ponens. Except for ideas with only one version, like Timecube, there are non-obvious truths that can be extracted. For instance, Mormons have contributed positively to the LW memepool. But the marginal cost of delving deep into real crackpottery probably isn’t worth the marginal benefit in truth.
This is all well and good, but imagine that, instead of living in a world where people generally don’t communicate optimally and tend to irrationally cling to their memes, we lived in a world of rational discourse, where truths are allowed to naturally bubble up to the surface and manifest as similar conclusions from disparate experiences.
In this hypothetical world you would benefit from arguing with a crackpot: you would supply xem with the evidence xe overlooked (from within xyr model it felt irrelevant, so xe never pursued it; that’s how I imagine one could end up with crackpot beliefs in a rational world), and xyr non-obvious truth would come up as a reason for xyr weird world-view. In that situation the marginal benefit of engagement is high, because behind most crackpot theories there would be an extremely rare, and thus valuable, experience (i.e. a piece of evidence about the nature of your common world), and the marginal cost of engagement is diminished, because your effort goes into adjusting both your map and xyrs rather than into defeating xyr cognitive defenses.
With me so far? It gets better. There’s no hard and fast boundary between our world and the one painted above. And there are different kinds of crackpots. I’m pretty sure that there are many people with beliefs that you have good enough reasons to dismiss, yet which make total sense to somebody with their experiences. And many of them can be argued with. They may be genuinely interested in finding the truth, or winning at life, or hearing out contrarian opinions. They may not have been shunned by society enough to develop thick defenses. They may be smart and rational (as far as humans go, which is not very far).
So finding the right kind of crackpots becomes a lucrative problem: a source of valuable insights and debating practice.
Weakly related: http://lesswrong.com/lw/1kh/the_correct_contrarian_cluster/ and http://en.wikipedia.org/wiki/God_of_the_gaps