I think the natural/manmade comparison between COVID and Three Mile Island has a lot of merit, but there are other differences that might explain the divergent responses. Some of them would imply a strong response to an AI disaster; others, less so.
Local vs global
To prevent nuclear meltdowns, you only need to ban reactors in the US; it doesn’t matter what happens elsewhere. Pandemic preparedness is more complicated, since a pandemic anywhere can spread everywhere.
Active spending vs loss of growth
It’s easier to pass nuclear regulations that limit growth, because forgone growth isn’t as obvious a loss as spending money from the public purse on pandemic measures.
Activity of lobbying groups
I get the impression that the anti-nuclear lobby was a lot bigger than any pro-pandemic-preparedness lobby. This may be partly caused by the natural vs. manmade distinction, so it might be a subpoint of that.
Tractability of problem
Preventing nuclear disasters seems more tractable than preparing for pandemics.
1979 vs 2020
Were our institutions stronger back then?
FWIW, I agree that a large AI disaster would prompt strong regulation and international agreements; my concern is that a small one would not, and small disasters from weaker AIs seem more likely to happen first.
Yeah, this is a better explanation than my post has. There were definitely multiple factors.
One aspect of the tractability of these coordination problems that makes it different from the tractability of problems in everyday life: I don’t think people largely “expect” their government to solve pandemic preparedness. To the average voter, it seems like something that can’t be solved. Whereas there’s pretty much a “zero-tolerance policy” (?) on nuclear meltdowns, because to most people that seems like something that should never happen. So it’s not necessarily about the problem being solvable in a traditional sense; it’s more about the public’s tendency to blame government officials when things go wrong.
I predict that if “something goes wrong” with AGI, the public’s instinct will be to say “this should never happen; the government needs to Do Something”, which in practice will mean blaming the companies involved and severely hampering their ability to publish or complete relevant research.