BTW, a potential bias you should be aware of in this situation is the human tendency to be irrationally inclined to go through with things once they’ve said they’re going to do them. (I believe Robert Cialdini’s Influence: Science and Practice talks about this.) So you might want to consider self-observing and trying to detect whether that bias is having any influence on your thought process. I (and, probably, all of the kind folks at SIAI—although of course I can’t speak for them) will completely forgive you if you go back on your public statements on this. Speaking for myself individually, I’d see it as a demonstration of virtue.
And just to be a little silly, I’ll use another technique from Influence on you: reciprocation. When I read that you didn’t think computer science would be fundamental to the development of strong AI, I immediately thought “That can’t be right”. I had a very strong gut feeling that somehow, computer science must be fundamental to the development of strong AI, and I immediately started trying to find a reason why it was. (It seems Vladimir Nesov’s reaction was very similar to mine, and note that he didn’t find much of a reason. My guess is his comment’s high score is a result of many LW readers sharing his and my gut instinct.) However, I noticed that my mind had entered one of its failure modes (motivated continuation) and I thought to myself “Well, I don’t have any solid argument now for why computer science must be fundamental, and there’s no real reason for me to look for an argument in favor of that idea instead of an argument against it.” So now I’ve publicly admitted that my gut instinct was unfounded and that my mind is broken; maybe using the Dark Technique of trying to get you to reciprocate will convince you to do the same. :P
To people who think I’m bringing about doomsday: if my ideas are substantively right, it’s going to take a long time before this stuff gets rolling. It will take a decade just to convince the mainstream scientific establishment. After that, things might speed up, but it’s still going to be a long, hard slog. Did I mention I have only a good question, not an answer? Let’s all take some deep breaths.
I believe Eliezer is a member of the school of thought which holds that the intelligence explosion could potentially be triggered by nine geniuses working together in a basement.
By the nether gods… IT ALL MAKES SENSE NOW
Note that I attacked a flaw in the argument (usage of analogy that assumes that computer science is about computers), and never said anything about the implied conclusion (that computer science is irrelevant for AI). And this does reflect my reaction.
Oh, sorry, I missed that.