I think the DRH quote is taken pretty far out of context, and Eliezer’s commentary on it is pretty unfair. DRH has a deeply personal respect for human intelligence. He doesn’t look forward to the singularity because he (correctly) points out that it would be the end of humanity. Most SI/LessWrong people accept that and look forward to it anyway, but to Hofstadter the standard singularity scenario is an extremely pessimistic vision of the future. Note that this is simply a matter of his personal values. He never claims that people are wrong to look forward to superintelligence, brain emulation, and things like that, just that he doesn’t. See this interview for his thoughts on the subject.