At the very least, personal protective equipment and isolation can protect against infectious disease. A virus deadlier and more infectious than COVID would also be taken far more seriously.
I think nuclear war is unlikely to wipe out humanity, since there are enough countries that are unlikely targets, and I don’t think even all of the US would be destroyed anyway. I’m less sure about nuclear winter, but those in the community who’ve researched it seem skeptical that it would cause extinction. Maybe it reduces the population enough for an AGI to target the rest of us or prevent us from rebuilding, though.
Some posts here:
https://forum.effectivealtruism.org/topics/nuclear-warfare-1
https://forum.effectivealtruism.org/topics/nuclear-winter
Yeah, I’m familiar with the arguments that neither pandemics nor nuclear war seem likely to be existential risks, i.e. risks that could cause human extinction. But I’d nonetheless expect such events to be damaging enough to serve a nefarious actor trying to prevent resistance.
Ultimately this whole line of reasoning seems superfluous to me, since it just seems so obvious that with sufficient cognitive power one can do ridiculous things. But for those who trip up on the suggested nanotech stuff, maybe a more palatable argument is: you know those other x-risks you’re already worrying about? A sufficiently intelligent antagonist can exacerbate them nigh-arbitrarily.