Note that Scenarios 2, 3, and 4 require Scenario 1 to be computed first, and that, if the entities in Scenarios 2, 3, and 4 are conscious, their conscious experience is exactly the same, down to the finest detail, as that of the entity in Scenario 1, which necessarily preceded them. Therefore, the question of whether 2, 3, and 4 are conscious seems irrelevant to me. Weird substrate-free computing stuff aside, the question of whether you are being simulated in 1 or 4 places/times is irrelevant from the inside, if all four simulations are functionally identical. It doesn’t seem morally relevant either: in order to mistreat 2, 3, or 4, you would have to first mistreat 1, so the moral question reduces to how you treat 1, regardless of whether 2, 3, and 4 are conscious.
>in order to mistreat 2, 3, or 4, you would have to first mistreat 1
What about deleting all evidence of 1 ever having happened, after it was recorded? 1 hasn’t been mistreated, but depending on your assumptions about consciousness, 2, 3, and 4 may have been.
Huh? That sounds like some 1984 logic right there. You deleted all evidence of the mistreatment after it happened, therefore it never happened?
This is a really interesting point that I hadn’t thought of!
I’m not sure where I land on the conclusion, though. My intuition is that two copies of the same mind emulation running simultaneously (assuming both are deterministic and are therefore performing identical computations) would have more moral value than a single copy, but I don’t have much confidence in that.