Removal Suggestion: Bayesian Judo. I’m an atheist, too, but this post has a pile of issues:
1. His reasoning is "If this one thing is false, the entire religion is wrong", which is a hasty generalization. I briefly explained in the comments why knocking down that one claim cannot prove the religion as a whole wrong.
2. Since the reasoning is poor, and he opens with "You can have some fun with people...", this really looks like he is just getting in a "good, solid dig" of the kind criticized in the politics-is-the-mind-killer post.
3. The post is also likely to scare off religious people by mind-killing them right as they begin reading the Sequences. A better objective would be to encourage them to keep reading.
4. To avoid encouraging problems like undiscriminating skepticism, or people confusing optimal things for rational ones and going the way of Ayn Rand's Objectivists, it is probably better not to put the following line in the first few sequences:
"There's a theorem of rationality called Aumann's Agreement Theorem which shows that no two rationalists can agree to disagree. If two people disagree with each other, at least one of them must be doing something wrong."
Regardless of whether this is true, if new people aren't inoculated against the conformity in thinking that this kind of statement encourages, it could make them more likely to behave like phyg members. (A sketch of what the theorem itself claims is below.)
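For readers who haven't met the theorem being quoted, here is a minimal sketch (mine, not from the post) of the mechanism it refers to: two agents who share a common prior and honestly exchange their posteriors end up with the same posterior. The prior of 0.5 and the signal accuracies of 0.8 and 0.7 are illustrative assumptions, not anything from Aumann's paper.

```python
# Minimal sketch of the dynamic behind Aumann-style agreement: two agents
# share a common prior over a binary state, each sees one private noisy
# signal, and an honest exchange of posteriors drives them to the same answer.
# All numbers here (prior 0.5, accuracies 0.8 and 0.7) are illustrative.

from itertools import product

PRIOR = 0.5               # common prior P(state = 1)
ACC_A, ACC_B = 0.8, 0.7   # P(signal = state) for agents A and B

def joint_prob(state: int, sig_a: int, sig_b: int) -> float:
    """P(state, sig_a, sig_b) under the common prior and signal model."""
    p_state = PRIOR if state == 1 else 1 - PRIOR
    p_a = ACC_A if sig_a == state else 1 - ACC_A
    p_b = ACC_B if sig_b == state else 1 - ACC_B
    return p_state * p_a * p_b

def posterior(evidence: dict) -> float:
    """P(state = 1 | observed signals), marginalising unobserved ones."""
    num = den = 0.0
    for state, sa, sb in product((0, 1), repeat=3):
        # skip signal combinations inconsistent with the observed evidence
        if any(evidence.get(k) not in (None, v)
               for k, v in (("a", sa), ("b", sb))):
            continue
        p = joint_prob(state, sa, sb)
        den += p
        if state == 1:
            num += p
    return num / den

# Each agent starts from their private signal alone.
sig_a, sig_b = 1, 0
post_a = posterior({"a": sig_a})   # A's posterior from its own signal
post_b = posterior({"b": sig_b})   # B's posterior from its own signal
print(f"before exchange: A={post_a:.3f}, B={post_b:.3f}")  # they disagree

# Announcing a posterior here fully reveals the underlying signal (each signal
# value maps to a distinct posterior), so after one honest exchange both
# agents condition on both signals, and their posteriors coincide.
shared = posterior({"a": sig_a, "b": sig_b})
print(f"after exchange:  A={shared:.3f}, B={shared:.3f}")   # they agree
```

With the illustrative numbers above, the run gives roughly A=0.800 and B=0.300 before the exchange, and about 0.632 for both afterwards.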
Agreed. Relatedly, "Arresting irrational information cascades" talks about point 4.