Yeah, this is a better explanation than my post has. There were definitely multiple factors.
One way the tractability of these sorts of coordination problems differs from the tractability of problems in everyday life: I don’t think most people “expect” their government to solve pandemic preparedness; to the average voter, it seems like something that can’t be solved. Whereas there’s pretty much a “zero-tolerance policy” on nuclear meltdowns, because to most people a meltdown seems like something that should never happen. So it’s not necessarily about the problem being solvable in a traditional sense; it’s more about the public’s tendency to blame their government officials when things go wrong.
I predict that if “something goes wrong” with AGI, the public’s instinct will be to say “this should never happen; the government needs to Do Something”, which in practice will mean blaming the companies involved and severely hampering their ability to publish or complete relevant research.