If it bothers you, consider the other people in the world to be either new (they don't exist elsewhere) or nonsentient. If the other people are new, the copies would have to diverge; but consider (as I said in another comment) the case where the environment constrains the divergence so that it isn't axiologically significant, i.e. none of the copies end up "messed up".
That’s just stipulated.
But the stipulation as stated leads to major problems. For instance, it implies that I'm copying the entire world full of people, not just me. That distorts the incentives.
Edit: It also implies that the copy will not be useful as a backup, since whatever takes me out is likely to take it out as well.
For the moment, consider the case where the environment that each copy is in is benign, so there is no need for backup.
I’m just trying to gauge the terminal value of extra, non-interacting copies.