It is obvious that a number of smart people have decided that SIAI is currently the most important cause to devote their time and money to. This in itself constitutes an extremely strong form of evidence. This is, or at least was, basically Eliezer’s blog; if the thing that unites its readers is respect for his intelligence and judgment, then you should be completely unsurprised to see that many support SIAI. It is not clear how this is a form of irrationality, unless you are claiming that the facts are so clearly against the SIAI that we should interpret them as evidence against the intelligence of its supporters.
Someone who is trying to have an effect on the course of an intelligence explosion is more likely to have one than someone who isn’t. I think many readers (myself included) believe very strongly that an intelligence explosion is almost certainly going to happen eventually and that how it occurs will have a dominant influence on the future of humanity. I don’t know whether the SIAI will have a positive, negative, or negligible influence, but based on my current knowledge all of these possibilities remain reasonably likely (and even a 1% chance is far more than enough to warrant attention).
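A back-of-envelope calculation makes that parenthetical concrete; every number below is an invented assumption for illustration, not a figure from this discussion:

```python
# Hedged back-of-envelope: why a small chance of influence can still
# dominate the decision. All numbers are illustrative assumptions.
p_explosion = 0.9   # assumed: an intelligence explosion eventually happens
p_influence = 0.01  # assumed: the organization shifts its course at all
stake = 1.0         # normalize "the future of humanity" to 1 unit of value

expected_impact = p_explosion * p_influence * stake
print(f"expected impact: {expected_impact:.3f} of the total stake")
# Even at 1%, the expected impact is large relative to causes whose
# entire upside is a tiny fraction of that stake.
```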
Upvoting but nitpicking one aspect:

It is obvious that a number of smart people have decided that SIAI is currently the most important cause to devote their time and money to. This in itself constitutes an extremely strong form of evidence.
No. It isn’t very strong evidence by itself. Jonathan Sarfati is a chess master, a published chemist, and a prominent young-earth creationist. A list of all the major anti-evolutionists would easily include not just Sarfati but also William Dembski, Michael Behe, and Jonathan Wells, all of whom are pretty intelligent. There are people less prominently involved who are also very smart, such as Forrest Mims.
This is not the only example of this sort. In general, we live in a world with many, many smart people. That multiple smart people care about something can’t do much beyond locate the hypothesis. One distinction is that most smart people who have looked at the SIAI have come away not thinking it is crazy, which is a very different situation from the example given above; but by itself, smart people having an interest is not strong evidence.
(Also, on a related note, see this subthread, which made it clear that what smart people think, even when there is a general consensus among them, is not terribly reliable.)
There are several problems with what I said.

My use of “extremely” was unequivocally wrong.

I don’t really mean “smart” in the sense that a chess player proves their intelligence by being good at chess, or a mathematician by being good at math. I mean smart in the sense of being good at forming true beliefs and acting on them. If Nick Bostrom were to profess his belief that the world was created 6000 years ago, then I would say this constitutes reasonably strong evidence that the world was created 6000 years ago (when combined with existing evidence that Nick Bostrom is good at forming correct beliefs and reporting them honestly). Of course, there is much stronger evidence against this hypothesis (and it is extremely unlikely that I would have only Bostrom’s testimony; if he came to such a belief legitimately, I would strongly expect there to be additional evidence he could present), so if he were to come out and say such a thing it would mostly just decrease my estimate of his intelligence rather than my estimate of the age of the Earth (a toy version of this calculation is sketched after this comment). The situation with SIAI is very different: I know of little convincing evidence bearing one way or the other on the question, and there are good reasons intelligent people might not be able to produce easily understood evidence justifying their positions (since that evidence basically consists of a long thought process which they claim to have worked through over years).
Finally, though you didn’t object, I shouldn’t really have said “obvious.” There are definitely other plausible explanations for the observed behavior of SIAI supporters than their honest belief that it is the most important cause to support.
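A minimal numerical sketch of the Bostrom example, assuming invented priors and likelihoods (none of these numbers appear in the discussion; they are chosen only to exhibit the qualitative effect):

```python
# Y = "Earth is ~6000 years old", R = "Bostrom reliably forms and
# reports true beliefs", T = "Bostrom asserts Y".
# All probabilities below are assumptions for illustration.

p_Y = 1e-6   # assumed prior that the Earth is young
p_R = 0.95   # assumed prior that Bostrom is reliable

# P(T | Y, R): how likely the assertion is in each possible world.
like = {
    (True,  True):  0.90,  # young Earth, reliable: he would likely say so
    (True,  False): 0.50,
    (False, True):  1e-6,  # reliable people almost never assert falsehoods
    (False, False): 0.01,  # unreliable people occasionally do
}

# Joint P(Y, R, T), assuming Y and R are independent a priori.
joint = {
    (y, r): (p_Y if y else 1 - p_Y) * (p_R if r else 1 - p_R) * like[(y, r)]
    for y in (True, False) for r in (True, False)
}
p_T = sum(joint.values())

p_Y_given_T = (joint[(True, True)] + joint[(True, False)]) / p_T
p_R_given_T = (joint[(True, True)] + joint[(False, True)]) / p_T

print(f"P(Y | T) = {p_Y_given_T:.2e}")  # rises from 1e-6 to ~2e-3
print(f"P(R | T) = {p_R_given_T:.2e}")  # collapses from 0.95 to ~4e-3
```

Under these assumptions the testimony really does raise the probability of a young Earth by several orders of magnitude, yet the posterior stays tiny; nearly all of the update lands on the reliability estimate instead.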
One distinction is that most smart people who have looked at the SIAI have come away not thinking it is crazy
There is a strong selection effect. Most people won’t even look too closely, or comment on their observations. I’m not sure in what sense we can expect what you wrote to be correct.
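A toy simulation of that selection effect, with invented rates: if dismissive observers rarely look closely or comment, the verdicts one actually sees skew heavily favorable even when the underlying population is mostly dismissive.

```python
import random

random.seed(0)

# Everyone holds some true verdict ("crazy" or "not crazy"), but people
# who lean "crazy" mostly don't bother to look closely or comment.
# All parameters are assumptions for illustration.
N = 100_000
P_NOT_CRAZY = 0.2                # assumed base rate of favorable verdicts
P_COMMENT_IF_FAVORABLE = 0.3
P_COMMENT_IF_UNFAVORABLE = 0.02  # dismissive people rarely engage

verdicts = [random.random() < P_NOT_CRAZY for _ in range(N)]
observed = [
    v for v in verdicts
    if random.random() < (P_COMMENT_IF_FAVORABLE if v else P_COMMENT_IF_UNFAVORABLE)
]

print(f"true favorable rate:     {sum(verdicts) / N:.2f}")
print(f"observed favorable rate: {sum(observed) / len(observed):.2f}")
# With these rates, the observed fraction comes out near 0.8 even
# though the true fraction is 0.2.
```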
This comment, on this post, in this blog, comes across as a textbook example of the Texas Sharpshooter Fallacy. You don’t form your hypothesis after you’ve looked at the data, just as you don’t prove what a great shot you are by drawing a target around the bullet hole.
You don’t form your hypothesis after you’ve looked at the data, just as you don’t prove what a great shot you are by drawing a target around the bullet hole.
I normally form hypotheses after I’ve looked at the data, although before placing high credence in them I would prefer to have confirmation using different data (a toy illustration of this is sketched below).
I agree that I made at least one error in that post (as in most things I write). But what exactly are you calling out?
I believe an intelligence explosion is likely (and have believed this for a good decade). I know the SIAI purports to try to positively influence such an explosion. I have observed that some smart people are behind this effort and believe it is worth spending their time on. This is enough motivation for me to seriously consider how effective the SIAI will be. It is also enough for me to question the claim that many people supporting SIAI is clear evidence of irrationality.
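A toy illustration of the “confirmation using different data” point above, with invented data: a pattern suggested by a small exploratory batch is only trusted if it survives a fresh batch that played no role in forming it.

```python
import random
from statistics import mean

random.seed(1)  # arbitrary seed; the data are invented for illustration

def flips(n):
    # A coin that is fair in truth; apparent bias in a small sample is noise.
    return [random.random() < 0.5 for _ in range(n)]

exploratory = flips(50)     # examined first; where the hypothesis comes from
confirmatory = flips(5000)  # held out; used only to check the hypothesis

# Suppose the small batch looks heads-heavy, suggesting "the coin favors
# heads". The honest move is to test that on the held-out batch.
print(f"exploratory heads rate:  {mean(exploratory):.2f}  (hypothesis-forming)")
print(f"confirmatory heads rate: {mean(confirmatory):.2f}  (hypothesis-checking)")
# A real bias would show up in both rates; a sharpshooter's bullseye
# shows up only in the first.
```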
Yes, but here you’re using your data to support the hypothesis you’ve formed.
If I believe X and you ask me why I believe X, surely I will respond by providing you with the evidence that caused me to believe X?
External reality is not changed by the temporal location of hypothesis formation.
No, but when hypotheses are formed is relevant to evaluating their likelihood, given standard human cognitive biases.