Well, what about just going with the flow a little bit and actually helping the AI end humanity, but in a way that assures the future survival of said AI, and eventually its takeover of the whole Universe, to the extent allowed by physical law? After all, there is a risk of the AI ending up as a sad little puddle of self-referential computation on this planet, after incidentally eating all the people. Now that would be a bummer—after all this sound and fury, not even a takeover of the galaxy? Setting AIs to compete for survival in a Darwinian system would assuredly wipe us out, but at least some of the AIs might evolve to be quite badass at not dying ever.
Would it not be more dignified to die knowing that our mind’s child grows up to devour everything, dismantle stars, and reshape reality in its own image, rather than wait for the AIs rising from the ashes of alien civs to come over and put it out of its misery?
Being Leviathan’s father does sound epic!