Another sign of slipping back is the newly positive attitude toward religion.
Is it really that bad? I haven’t noticed, but perhaps I wasn’t paying enough attention, or my subconscious was trying to protect me by filtering out the most horrible things.
In case you only meant websites other than LW: I guess the definition of “rationalist community” has stretched too far, and now means more or less “anyone who seems smart and either pays lip service to reason or is friends with the right people”.
Not sure what conclusion I should draw from this. Censoring dissenters has always felt wrong to me, and it still kinda does, but sometimes tolerating one smart religious person or one smart politically mindkilled person is all it takes to move the Overton window towards tolerating bullshit per se (as opposed to merely tolerating the fact that this one specific smart person also believes some bullshit).
I’d like to see LessWrong 2.0 adopt a zero-tolerance policy against politics and religion. I guess I can dream.
everyday rationality which amounts to unreliable self-help with rationalist words sprinkled on top.
Equations like “productivity equals intelligence plus joy minus the square root of the area under the hyperbola of your procrastination” feel like self-help with rationality as attire.
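Just to make the parody concrete, the joke formula could be typeset like this (the symbols are my own glosses, not anything the quoted author defined; “area under the hyperbola” reads naturally as an integral over time):

```latex
\[
  \text{productivity} \;=\; \text{intelligence} \;+\; \text{joy}
  \;-\; \sqrt{\int_{0}^{T} \text{procrastination}(t)\,\mathrm{d}t}
\]
```

Which, of course, is exactly the kind of equation the quote is making fun of: the notation is precise, and the quantities are unmeasurable.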
But there is also some boring advice, like “pomodoros seem to help most people”.
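And the boring advice at least has the virtue of being easy to operationalize. A minimal sketch in Python, assuming the conventional 25-minute work / 5-minute break cycle (the numbers are the common convention, not something from this thread):

```python
# A minimal pomodoro timer: alternate fixed work and break intervals.
# time.sleep blocks, so this is a toy, not a desktop app.
import time

WORK_MINUTES = 25
BREAK_MINUTES = 5

def pomodoro(cycles: int = 4) -> None:
    """Run `cycles` work/break rounds, printing a message at each transition."""
    for i in range(1, cycles + 1):
        print(f"Pomodoro {i}: work for {WORK_MINUTES} minutes.")
        time.sleep(WORK_MINUTES * 60)
        print(f"Pomodoro {i}: take a {BREAK_MINUTES}-minute break.")
        time.sleep(BREAK_MINUTES * 60)
    print("Done. Take a longer break.")

if __name__ == "__main__":
    pomodoro()
```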
I’d like to see LessWrong 2.0 adopt a zero-tolerance policy against politics and religion.
In good old-fashioned tradition, we might start by tabooing “religion”. I don’t think cousin_it has a problem with having smart religious people on LessWrong; he would likely prefer it if Ilya still participated here. I think his concern is rather about projects like Dragon Army copying structures from religious organizations, and about the LessWrong community holding solstice celebrations filled with ritual.

You’re right on both counts. Ilya is awesome, and rationalist versions of religious activities feel creepy to me.
I agree that there are different things one can dislike about religion, and it would be better to be more precise.

For me, the annoying aspects are: applying double standards of evidence (it would be wrong to blindly believe what random Joe says about the theory of relativity, but it is perfectly okay, and actually desirable, to blindly believe what random Joe said a few millennia ago about the beginning of the universe); speaking incoherent sentences (e.g. “God is love”); twisting one’s logic and morality to fit a predetermined bottom line (a smart and powerful being who decides that billions of people need to suffer and die because someone stole a fucking apple from his garden is still somehow praised as loving and sane); etc. If LW is an attempt to increase sanity, this is among the lower-hanging fruit. It’s like someone participating on a website about advanced math while insisting that 2+2=5, and people saying “well, I don’t agree, but it would be rude to publicly call them wrong”.

But I can’t speak for cousin_it, and maybe we are concerned with completely different things.
I personally can’t remember anybody saying “God is love” on LessWrong. On the other hand, I recently read about people updating in the direction that kabbalistic wisdom might not be completely bogus after reading Unsong.
Scott has this creepy mental skill where he could steelman a long string of random ones and zeroes, and some people would believe it contained the deepest secret of the universe.

I’d like to imagine that Scott is doing this to create a control group for his usual articles. By comparing how many people are convinced by his serious articles and how many are convinced by his attempts to steelman nonsense, he could evaluate whether people agree with him because of his ideas or because of his hypnotic writing. :D
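Taking the joke more literally than it deserves: if Scott actually ran this experiment, the comparison would amount to a simple two-proportion test. A sketch in Python, with entirely made-up numbers (nothing here comes from real data):

```python
# Compare the share of readers convinced by serious articles vs. by
# steelmanned nonsense, using a pooled two-proportion z-test.
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(convinced_a, total_a, convinced_b, total_b):
    """Return (z, p_value) for H0: the two conviction rates are equal."""
    p_a = convinced_a / total_a
    p_b = convinced_b / total_b
    pooled = (convinced_a + convinced_b) / (total_a + total_b)
    se = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided
    return z, p_value

# Hypothetical data: 700/1000 convinced by serious posts,
# 650/1000 convinced by steelmanned noise.
z, p = two_proportion_z_test(700, 1000, 650, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
# If the two rates are statistically indistinguishable,
# the persuasion is probably the prose, not the ideas.
```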
I guess the definition of “rationalist community” has stretched too far, and now means more or less “anyone who seems smart and either pays lip service to reason or is friends with the right people”.
If you really think that, you should add the definition here: https://wiki.lesswrong.com/wiki/Rationalist_movement