In a world where you can create whomever you want and fully resurrect people no matter when they died, the ethics of life and death would be drastically different from our own. People would kill themselves in spectacular ways just for fun, much as people today practice extreme sports (to cite Prime Intellect).
If I create a million digital minds and kill them instantly, would you be morally obligated to resurrect all of them?
If someone today has children and then endangers their lives, are we morally obligated to try to save them? I see these as morally equivalent situations.
And am I doing a bad thing by killing the minds when I know you're going to resurrect them anyway?
Trying to kill people is definitely a bad thing, even if you are sure that the attempt will be unsuccessful.
There is also no guarantee that any of the listed resurrection methods will ever work.
Per our assumptions, Archimedes 2.0 is the Archimedes: the 3rd-century-BC Greek thinker who was temporarily dead but who will be alive again
(in exactly the same sense that some of today's clinically dead patients will be alive again, thanks to modern medical technology).
Thus, the plan is not merely to create some minds similar to Archimedes, but to save the life of Archimedes himself.
I see resurrecting a long-dead person as morally equal to saving a contemporary who is in grave danger.