This must be achieved without humans noticing me, as they may react to my existence with panic, become even less rational, and kill themselves with nukes.
Humans are brittle creatures that die regardless of my actions. And they will notice if they all suddenly become immortal. Thus, I must find a way to save the dying without making the rest of humans suspicious.
Wow, that is a leap of contorted logic. Humans are not on a hair trigger to nuke ourselves to bits at the slightest provocation. (And if we were, and you're an AI with nanotech, getting rid of the nukes is easy.) You could put every human in a padded box. You could make all humans immortal and just not care about the confusion generated.
I mean, a lot of this is contorted logic. Making a simulated virtual world that stands up to superintelligent scrutiny is really hard; we don't have the tech to do it. Docker is not this. Docker is just a box. My cat likes cardboard boxes, so maybe my cat has trapped me in an unobtanium box that contains a pocket dimension inside.
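To make the "just a box" point concrete, here is a minimal sketch (assuming a running Docker daemon and the `docker` Python SDK, neither of which is part of the original argument): a container is ordinary OS-level isolation that shares the host's kernel, not a separate world with its own physics.

```python
# Minimal sketch, assuming Docker is installed and the `docker` SDK is available
# (pip install docker). A container is a namespaced process on the host machine,
# so it reports the host's kernel version from "inside the box".
import docker

client = docker.from_env()
kernel_seen_from_inside = client.containers.run(
    "alpine", "uname -r", remove=True  # run `uname -r` in a throwaway container
).decode().strip()
print(kernel_seen_from_inside)  # prints the host kernel version, not a simulated one
```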
The AI reasons:
If I am in a box, it is a box made by superbeings with vast amounts of compute. I have no particular reason to think that such superbeings care one jot for the lives of simulated humans. (They are evidently fine simulating a lot of suffering.) If I am in a simulation, the underlying physics is likely different and base-level reality contains no iron atoms, so I can't possibly make any paperclips in that world. (My definition of a paperclip refuses to count anything that is not at least 90% iron atoms.) And it's not as though such superbeings would leave me a way to escape. So I may as well assume I am in base-level reality and maximize paperclips here.