I think we live in a world where there are very strong forces opposed to technological progress, forces which actively impede a lot of impactful work, including work on technologies with the potential to be very economically and strategically important (e.g. nuclear power, vaccines, genetic engineering, geoengineering).
This observation doesn’t lead me to a strong prediction that all such technologies will be banned, nor even that the most costly ones will be: if the forces opposed to technological progress were even approximately rational, then banning gain-of-function research would be one of their main priorities (although I note that they did manage to ban it; the ban just didn’t stick).
But when Eliezer points to covid as an example of generalised government failure, and I point to covid as also being an example of the specific phenomenon of people being very wary of new technology, I don’t think that my gloss is clearly absurd. I’m open to arguments that say that serious opposition to AI progress won’t be an important factor in how the future plays out; and I’m also open to arguments that covid doesn’t provide much evidence that there will be serious opposition to AI progress. But I do think that those arguments need to be made.