[...] many reasons to doubt [...] belief system of a cult [...] haphazard musings of a high school dropout [...] never written a single computer program [...] professes to be an expert [...] crying chicken little [...] only a handful take the FAI idea seriously.
[...] dogma [...] ignore the uncertainties at every step [...] starting a church [...] religious thinking wrapped up to look like rationality.
I am unable to take this criticism seriously. It’s just a bunch of ad hominem and hand-waving. What are the reasons to doubt? How are they ignoring the uncertainties when they list them on their webpage and bring them up in every interview? How is a fiercely atheist group religious at all? How is it a cult (there are lots of posts about this in the LessWrong archive)? How is it irrational?
Edit: And I’m downvoted. You actually think a reply that’s 50% insult and emotionally loaded language has substance that I should be engaging with? I thought it was a highly irrational response on par with anti-cryonics writing of the worst order. Maybe you should point out the constructive portion.
The response by this individual seems like a summary, rather than an argument. The fact that someone writes a polemical summary of their views on a subject doesn’t tell us much about whether their views are well-reasoned or not. A polemical summary is consistent with being full of hot air, but it’s also consistent with having some damning arguments.
Of course, to know either way, we would have to hear this person’s actual arguments, which we haven’t, in this case.
How are they ignoring the uncertainties when they list them on their webpage and bring them up in every interview?
Just because a certain topic is raised, doesn’t mean that it is discussed correctly.
How is a fiercely atheist group religious at all?
The argument is that their thinking has some similarities to religion. It’s a common rhetorical move to compare any alleged ideology to religion, even if that ideology is secular.
How is it a cult (there are lots of posts about this in the LessWrong archive)?
The fact that EY displays an awareness of cultish dynamics doesn’t necessarily mean that SIAI avoids them. Personally, I buy most of Eliezer’s discussion that “every cause wants to become a cult,” and I don’t like the common practice of labeling movements as “cults.” The net for “cult” is being drawn far too widely.
Yet I wouldn’t say that the use of the word “cult” means that the individual is engaging in bad reasoning. While I think “cult” is generally a misnomer, it’s generally used as shorthand for a group having certain problematic social-psychological qualities (e.g. conformity, obedience to authority). The individual could well be able to back those criticisms up. Who knows.
We would need to hear this individual’s actual arguments to be able to evaluate whether the polemical summary is well-founded.
P.S. I wasn’t the one who downvoted you.
Edit:
high school dropout, who has never written a single computer program
I don’t know the truth of these statements. The second one seems dubious, but it might not be meant to be taken literally (“Hello World” is a program). If Eliezer isn’t a high school dropout, and has written major applications, then the credibility of this writer is lowered.
I believe you weren’t supposed to engage that reply, which is a dismissal more than criticism. I believe you were supposed to take a step back and use it as a hint as to why the SIAI’s yearly budget is 5 x 10^5 rather than 5 x 10^9 USD.
Re: “How is it a cult?”
It looks a lot like an END OF THE WORLD cult. That is a well-known subspecies of cult—e.g. see:
http://en.wikipedia.org/wiki/Doomsday_cult
“The End of the World Cult”
http://www.youtube.com/watch?v=-3uDmyGq8Ok
The END OF THE WORLD acts as a superstimulus to human fear mechanisms, and causes caring people to rush to warn their friends of the impending DOOM, spreading the panic virally. END OF THE WORLD cults typically act by stimulating this energy and then feeding from it. The actual value of p(DOOM) is not particularly critical for all this.
The net effect on society of the FEARMONGERING that usually results from such organisations seems pretty questionable. Some of those who become convinced that THE END IS NIGH may try to prevent it, but others will neglect their future plans, and are more likely to rape and pillage.
My “DOOM” video has more—http://www.youtube.com/watch?v=kH31AcOmSjs
Slight sidetrack:
There is, of course, one DOOM scenario (ok, one other DOOM scenario) which is entirely respectable here—that the earth will be engulfed when the sun becomes a red giant.
That fate for the planet haunted me when I was a kid. People would say “But that’s billions of years in the future” and I’d feel as though they were missing the point. It’s possible that a more detailed discussion would have helped....
Recently, I’ve read that school teachers have a standard answer for kids who are troubled by the red giant scenario [1]: that people will have found a solution by then.
This seems less intellectually honest than “The human race will be long gone anyway”, but not awful. I think the most meticulous answer (aside from “that’s the far future and there’s nothing to be done about it now”) is “that’s so far in the future that we don’t know whether people will be around, but if they are, they may well find a solution.”
[1] I count this as evidence for the Flynn Effect.
Downvoted for this.