I’d like to see LessWrong 2.0 adopt a zero-tolerance policy against politics and religion.
In good old-fashioned tradition, we might start by tabooing “religion”. I don’t think cousin_it has a problem with having smart religious people on LessWrong. He would likely prefer it if Ilya still participated here.
I think his concern is rather about a project like Dragon Army copying structures from religious organizations and the LessWrong community having solstice celebrations filled with ritual.
I agree that there are different things one can possibly dislike about religion, and it would be better to be more precise.
For me, the annoying aspects are applying double standards of evidence (it would be wrong to blindly believe what random Joe says about the theory of relativity, but it is perfectly okay and actually desirable to blindly believe what random Joe said a few millennia ago about the beginning of the universe), speaking incoherent sentences (e.g. “god is love”), twisting one’s logic and morality to fit the predetermined bottom line (a smart and powerful being who decides that billions of people need to suffer and die because someone stole a fucking apple from his garden is still somehow praised as loving and sane), etc. If LW is an attempt to increase sanity, this is among the lower-hanging fruit. It’s like someone participating on a website about advanced math while insisting that 2+2=5, and people saying “well, I don’t agree, but it would be rude to publicly call them wrong”.
But I can’t speak for cousin_it, and maybe we are concerned with completely different things.
I personally can’t remember anybody saying “God is love” on LessWrong. On the other hand, I recently read about people updating in the direction that kabbalistic wisdom might not be completely bogus after reading Unsong.
Scott has this creepy mental skill where he could steelman a long string of random ones and zeroes, and some people would believe it contains the deepest secret of the universe.
I’d like to imagine that Scott is doing this to create a control group for his usual articles. By comparing how many people got convinced by his serious articles and how many people got convinced by his attempts to steelman nonsense, he can evaluate whether people agree with him because of his ideas or because of his hypnotic writing. :D
You’re right on both counts. Ilya is awesome, and rationalist versions of religious activities feel creepy to me.