In the LW entry on RationalWiki, they make fun of Eliezer for the Roko incident.
I always felt like this was unfair, because it amounts to attacking him for actually believing the things he talks about. That is, it’s okay to talk about an immensely powerful AI arising in the future; it’s just not okay to act on that belief.
If you don’t disagree with someone’s beliefs, don’t chastise that person for acting consistently with them.
This is because of spill-over from ‘religious tolerance’. Most people will feel uncomfortable mocking a ridiculous belief; they have the “everyone is entitled to their own opinion” meme in mind. This makes people disagree with others’ beliefs much less than they ought to.
People are much more comfortable mocking ridiculous actions (because everyone is not entitled to their own facts), which is why evangelists are scorned where the average religious person wouldn’t be: the evangelist acts consistently, and the average religious person inconsistently, on beliefs that the mocker doesn’t disagree with.
I think the people who wrote the entry probably do disagree with Eliezer’s beliefs in this regard. They seem to be mocking his beliefs, not just the actions he takes based on them.
That’s not to say that there’s any shortage of people who do take issue with others, or even outright mock them, for acting on beliefs they do not disagree with.
Perhaps I should have said that I detect both: they mock the belief, and additionally mock that it’s acted on.
I suspect (*) the principle is: sincere stupidity is no less stupid than insincere stupidity.
It is important here to note that it’s a silly wiki of no importance that doesn’t pretend to be of any importance. (Many readers aren’t happy with its obnoxiousness.) It just happens to be one of the few places on the Internet with a detailed article on LessWrong.
If this is considered a problem, the solution would be a publicity push for LessWrong to get it to sufficient third-party notability for a Wikipedia article or something. The question then is whether that would be good for the mission: “refining the art of human rationality.” I’d suggest seeing how the influx of HP:MoR readers affects things. A small September before a large one.
(*) “I suspect” as I’m not going to be so foolish as to claim the powers of a spokesman.
That isn’t the belief being mocked.