What do you think that I think it means? That post is one of many that are implicitly used to reject any criticism of the idea of AI going FOOM. I was especially thinking of you and your awkward responses about predictions and falsification. In and of itself I agree with the post, but used selectively it becomes a soldier against unwanted criticism. Reading the last few posts of the sequences rerun made me even more confident that the whole purpose of this project is to brainwash people into buying the AI FOOM idea.
I doubt that’s the main point of the project; I would hope to know, as I have lurked here in great detail since it was first envisioned. That being said, I agree that wedrifid’s answer is surprisingly terse.
Well, Eliezer Yudkowsky is making a living telling people that they ought to donate money to his “charity”.
Almost a year ago I posted my first submission here. I have been aware of OB/LW for longer than that but didn’t care until I read about the Roko incident. That made me really curious.
I am too tired to go into any detail right now, but what I learnt since then didn’t make me particularly confident in the epistemic standards of LW, despite the solemn assertions of its members.
The short version: the assurance that you are an aspiring rationalist might mislead some people into assigning some credence to your extraordinary claims, but it won’t make them less wrong.
There are many reasons why I am skeptical. As I said in another comment above, reading some of the posts linked to by the sequences rerun made Eliezer Yudkowsky seem much less trustworthy in my opinion. Before, I thought that some of his statements, e.g. “If you don’t sign up your kids for cryonics then you are a lousy parent,” were negligible lapses of sanity, but it now appears to me that such judgmental statements are the rule.
I think I understand your point of view and I agree with your sentiments, but do you honestly believe that Eliezer does all this for the money? I think that he likes being able to spend all his time working on this, and the Singularity Institute definitely treats him well, but from what I’ve seen the majority of people on Less Wrong, including him, really do want to save the world.
As for his statement about cryonics: if he’s passive about it, I don’t think many of the lurkers would consider signing up. Cryonics seems like a long shot to me, but I think it’s reasonable to assume that he writes so emotionally about it because he honestly just wants more people to be vitrified in case we do manage to create an FAI.
I would love to hear more about your reasons for skepticism, because I share many of the same concerns, but so far I’ve heard little to contradict the wisdom on LW/OB.
I agree that wedrifid’s answer is surprisingly terse
Not surprisingly, to those who have experience with wedrifid. Merely annoyingly. Though in this case he is making an allusion to a well known trope. Google on the phrase “does not mean what you think it means”. If XiXiDu has referenced that “You’re Entitled …” posting before, for roughly the same debunking purpose, then wedrifid’s terse putdown strikes me as rather clever.
Some context.