I agree with TheOtherDave. If you imagine that we scan someone’s brain and then run one thousand simulations of them walking around the same environment, all having exactly the same experiences, it doesn’t matter if we turn one of those simulations off. Nobody’s died. What I’m saying is that the person is the mental states, and what it means for two people to be different people is that they have different mental states.
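To make the intuition concrete, here is a toy sketch (the integer “brain state” and the `step` rule below are invented placeholders, not a claim about how uploads would actually work): deterministic copies started from the same scan stay in lockstep, so deleting one removes no mental state that isn’t still instantiated elsewhere.

```python
# Toy sketch: N deterministic copies of the same scanned state run in
# lockstep. The integer "brain state" and the update rule are invented
# placeholders, not a model of real uploads.

def step(state: int) -> int:
    """One deterministic tick of simulated experience (a stand-in rule)."""
    return (state * 6364136223846793005 + 1442695040888963407) % 2**64

scan = 42                 # the initial brain scan, identical for every copy
copies = [scan] * 1000    # one thousand simulations of the same person

for _ in range(100):      # run every copy forward through the same ticks
    copies = [step(s) for s in copies]

del copies[0]             # "turn one of those simulations off"

# No mental state was lost: every surviving copy holds the exact state
# the deleted copy had.
assert all(s == copies[0] for s in copies)
```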
I’m not really sure about the morality of punishing them both for the crimes of one of them, though. On one hand, the one who didn’t do it isn’t the same person as the one who did—they didn’t actually experience committing the murder or whatever. On the other hand, they’re also someone who would have done it in the same circumstances—so they’re dangerous. I don’t know.
it doesn’t matter if we turn one of those simulations off. Nobody’s died.
You are decreasing the amount of that person that exists.
Suppose the many-worlds interpretation is true. Now I flip a fair quantum coin and kill you if it comes up heads. Then in 50% of the worlds you still live, so by your reasoning, nobody has died. All that changes is the amplitude of your existence.
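To spell out what “the amplitude of your existence” means here, in standard many-worlds bookkeeping (nothing beyond textbook Dirac notation is assumed): the fair quantum coin puts the world into an equal superposition, and the killing correlates your survival with just one branch.

```latex
% Equal-amplitude branching for a fair quantum coin flip:
\[
  \lvert \psi \rangle
    = \tfrac{1}{\sqrt{2}}\,\lvert \text{heads} \rangle \lvert \text{you dead} \rangle
    + \tfrac{1}{\sqrt{2}}\,\lvert \text{tails} \rangle \lvert \text{you alive} \rangle
\]
% The measure (squared amplitude) of branches in which you survive is
% (1/\sqrt{2})^2 = 1/2, down from 1 before the flip.
```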
Suppose the many-worlds interpretation is true. Now I flip a fair quantum coin and kill you if it comes up heads. Then in 50% of the worlds you still live, so by your reasoning, nobody has died. All that changes is the amplitude of your existence.
Well, maybe. But there is a whole universe full of people who will never speak to you again and are left to grieve over your body.
You are decreasing the amount of that person that exists.
Yes, there is a measure of that person’s existence (number of perfect copies) which I’m reducing by deleting a perfect copy of that person. What I’m saying is precisely that I don’t care, because that is not a measure of people I value.
Similarly, if I gain 10 pounds, there’s a measure of my existence (mass) which I thereby increase. I don’t care, because that’s not a measure of people I value.
Neither of those statements is quite true, admittedly. For example, I care about gaining 10 pounds because of knock-on effects—health, vanity, comfort, etc. I care about gaining an identical backup because of knock-on effects—reduced risk of my total destruction, for example. Similarly, I care about gaining a million dollars, I care about gaining the ability to fly; there are all kinds of things I care about. But I assume that your point here is not that identical copies are valuable in some sense, but that they are valuable in some special sense, and I just don’t see it.
As far as MWI goes, yes… if you posit a version of many-worlds where the various branches are identical, then I don’t care if you delete half of those identical branches. I do care if you delete me from half of them, because that causes my loved ones in those branches to suffer… or half-suffer, if you like. Also, because the fact that those branches have suddenly become non-identical (since I’m in some and not the others) makes me question the premise that they are identical branches.
You are decreasing the amount of that person that exists.
And this “amount” is measured by the number of simulations? What if one simulation uses twice as many atoms (e.g. by having thicker transistors)? Does it count twice as much? What if one simulation double-checks each result and another does not? Does it count as two?
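To make the ambiguity concrete, here is a toy sketch (both `run_*` variants below are invented for illustration): a simulation that redundantly double-checks each step does twice the physical work yet produces exactly the same trajectory, so a behavioral count sees one simulation while a physical-work count sees two.

```python
# Toy sketch: the same deterministic simulation, with and without redundant
# double-checking. Both variants are invented stand-ins.

def step(state: int) -> int:
    return (state * 1103515245 + 12345) % 2**31

def run_plain(state: int, ticks: int) -> int:
    for _ in range(ticks):
        state = step(state)
    return state

def run_double_checked(state: int, ticks: int) -> int:
    for _ in range(ticks):
        first = step(state)
        second = step(state)    # recompute the same tick...
        assert first == second  # ...and "double check each result"
        state = first
    return state

# Behaviorally one and the same computation, even though one variant does
# twice the physical work:
assert run_plain(7, 1000) == run_double_checked(7, 1000)
```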
All that changes is the amplitude of your existence.
The equivalence between copies spread across the many worlds and identical simulations running in the same world is yet to be proven or disproven, and I expect it won’t be proven or disproven until we have a better understanding of the hard problem of consciousness.
Well, maybe. But there is a whole universe full of people who will never speak to you again and are left to grieve over your body.
Good point.
There is of course a difference between death and non-existence.