So you agree that yes, intelligence is continually generating “extra problems” for us to deal with. As you point out, many of the most pressing problems in the modern world are unforeseen consequences of useful technologies. You just believe that increases in human intelligence will invariably outpace the destructive power of the problems, whereas I don’t.
The premise of this diary was many earths, so I’d submit that there are certainly many earths on which the problem of nuclear warfare outpaced humanity’s capacity to deal with it intelligently, and that in the end we could very well share their fate.
I’ll also note that I fail to see how anyone could conclude from what I’ve written above that my prescription for humanity is stupid pills.
I agree completely. If the solutions cannot outpace the intelligence-generated problems, total destruction awaits.
I apologize if the stupid-pill characterization feels wrong; I was just trying to think of a viable alternative to increasing intelligence.
I’m glad we’ve hashed this out. I think that bias about the messianic/apocalyptic role of technology has largely been overlooked on this site, so I was glad to see this entry of Eliezer’s.
Regardless of whether or not they’re true, I tend to think that arguments about the arc of history and the like are profoundly counterproductive. People won’t vote if they think it’s a landslide, either for their guy or against. And I suspect I differ from others on this site in this respect, but I find it hard to get ginned up about cosmic endeavors, simply because they seem so remote from my experience.
And I don’t think we need an alternative! What I was trying to point out from the start was that increasing our predictive ability is necessary but not sufficient to save the world. Entirely selfish, entirely rational actors will doom the planet if we let them.