It wasn’t, no. It was a reminder to everyone else of XiXi’s general MO, and of the benefit he gets from convincing others that EY is a megalomaniac, using any means necessary.
You keep saying this and things like it, and not providing any evidence whatsoever when asked, directly or indirectly.
People who provide evidence for these things just end up starting long, pointless diatribes. I’m not interested in that kind of time commitment.
Aris Katsaris made a big deal about a comment by Eliezer Yudkowsky that I had forgotten about. His accusation that I deliberately ignored that comment made no sense at all, because I was the one who publicized the newer comment, in which Yudkowsky expressed a similar opinion. If I had been interested in hiding that opinion, I would not have told other people about the newer one as well, a comment made in an obscure subreddit.
And if we are talking about people forgetting things: Eliezer Yudkowsky even forgot that he had deleted the post that caused him so much pain.

Maybe it was a bit more complicated?
Just in repeating a claim that you literally cannot back up.
Really, this is the universe extending you the Crackpot Offer: accuse someone of “lies and slander”, and when called on it, just keep repeating the claim. This is what actual cranks do.

Oh, now I remember our last meeting.
This seems to imply that even if I do back up this statement with links and argument and such, you’ll just ignore it and disengage. That’s not a lot of incentive for me to get involved.
It wasn’t, no. It was a reminder to everyone else of XiXi’s general MO...
Circa 2005 I had a link to MIRI (then called the Singularity Institute) on my homepage. Circa 2009 I was even advertising LessWrong.
I am on record as saying that I believe most of the sequences to consist of true and sane material. I am on record as saying that I believe LessWrong to be the most rational community.
But in 2010, due to an incident that may not be mentioned here, I noticed some extreme tendencies and beliefs that might easily outweigh all the positive qualities. I also noticed that a certain subset of people seems to have a very weird attitude when it comes to criticism pertaining to Yudkowsky, MIRI, or LW.

I’ve posted a lot of arguments that were never meant to decisively refute Yudkowsky or MIRI, but to show that many of the extraordinary claims can be weakened. The important point here is that I did not even have to do this, as the burden of evidence is not on me to disprove those claims, but on the people who make them. They need to prove that their claims are robust and not just speculations about possible bad outcomes.
You could have simply not responded.