Mirror cells and novel viruses are well within ‘boring’ advanced biotech, which can be quite dangerous. My argument of implausibility was directed at sci-fi hard nanotech, like grey goo.
If I had to guess at why these counterarguments fall apart, it’s that an unaligned AGI wouldn’t design a pandemic by mistake: a germ capable of causing a pandemic would have to be specifically designed to target human biology.
That seems plausible. The risk is that an unaligned AGI could kill or weaken humanity through advanced biotech. I don’t think this is the most plausible outcome of unaligned AGI; more likely it would quietly take over the world without killing us. If it did kill humanity, that would come later, but it probably wouldn’t need to.