There are many FAI failure modes that don’t involve gratuitous torture. I think ‘could’ is justified here, especially in comparison to a bioweapon catastrophe.
Right, there are plenty of failure modes: some are less unpleasant than others, and some are probably horrific beyond our worst nightmares. I suspect that any particular set of scenarios we find comforting would have measure zero in the space of possible outcomes. If so, preferring death by AI over death by a bioweapon is just a failure of imagination.
It doesn’t take much comfort to beat a bioweapon that actually succeeds in killing everyone.
Simply using our atoms to make paperclips, and being quick about it, wins.