Well, think of it this way: you are dead/non-existent in the vast majority of universes as it is.
How is that relevant? If I take some action that results in the death of myself in some other Everett branch, then I have killed a human being in the multiverse.
Think about applying your argument to this universe. You shoot someone in the head, they die instantly, and then you say to the judge: “Well, think of it this way: he’s not around to experience this. Besides, there are other worlds where I didn’t shoot him, so he’s not really dead!”
You can’t appeal to common sense. That’s the point of quantum immortality: it defies our common-sense notions about death. Naturally so, since we are used to assuming a single-threaded universe, where death is equivalent to ceasing to exist.
Of course, if you kill someone, you still cause that person pain in the vast majority of universes, as well as grief to their family and friends.
If Star Trek-style teleportation were possible by creating a clone and deleting the original, would that be equivalent to suicide/murder/death? If you could upload your mind to a computer but had to destroy your biological brain, would that be suicide, and would the upload really be you? Does destroying copies really matter as long as one lives on (assuming the copies don’t suffer)?
You absolutely appeal to common sense on moral issues. Morality is applied common sense, in the Minsky view of “common sense” as an assortment of deductions and inferences extracted from the tangled web of my personal experiential and computational history. Morality is the result of applying that common-sense knowledge base to possible actions in a planning algorithm.
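To make that concrete, here’s a minimal Python sketch of the planning-algorithm picture, under the assumption that common sense can be encoded as scoring rules; every name in it (moral_score, choose_action, the example rules) is a hypothetical illustration, not anyone’s actual proposal:

```python
# A minimal sketch of the "morality as applied common sense" view above.
# The rule encodings and scores are hypothetical illustrations only.

def moral_score(action, common_sense):
    """Sum the (possibly negative) weights the learned rules assign to an action."""
    return sum(rule(action) for rule in common_sense)

def choose_action(candidates, common_sense):
    """The planning step: pick the candidate the knowledge base rates highest."""
    return max(candidates, key=lambda a: moral_score(a, common_sense))

# "Common sense" as rules extracted from experience, encoded as scorers.
common_sense = [
    lambda a: -1000 if a == "kill" else 0,  # killing rates strongly negative
    lambda a: 10 if a == "help" else 0,     # helping rates positive
]

print(choose_action(["kill", "help", "do nothing"], common_sense))  # -> help
```

The point is just the shape of the computation: rules distilled from experience score candidate actions, and the planner picks the best-scoring one.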
Quantum “immortality” involves a sudden, unexpected, and unjustified redefinition of “death.” That argument works if you buy the premise. But I don’t.
If you are saying that there is no difference between painlessly, instantaneously killing someone in one branch while letting them live in another, versus letting that person live in both, then I don’t know how to proceed. If you’re going to say that, then you might as well make yourself indifferent to the arrow of time as well, in which case it doesn’t matter if that person dies in all branches, because he still “exists” in history.
Now I no longer know what we are talking about. According to my morality, it is wrong to kill someone. The existence of other branches where that person does not die makes not even an epsilon of difference to my evaluation of moral choices in this world. The argument from the other side seems inconsistent to me.
And yes, Star Trek transporters and destructive uploaders are death machines, a position I’ve previously articulated on LessWrong.
You are appealing to a terminal value that I do not share. I think caring about clones is absurd. As long as one copy of me lives, what difference does it make if I create and delete a thousand others? It doesn’t change my experience or theirs. Nothing would change and I wouldn’t even be aware of it.
From my point of view, I do not like the thought that I might be arbitrarily deleted by a clone of myself. I therefore choose to commit to not deleting clones of myself, thus preventing myself from being deleted by any clones that share that commitment.
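That commitment logic can be rendered as a toy program, assuming (purely for illustration) that a clone inherits its original’s commitments; the Clone class below and its fields are hypothetical, not a claim about how copies would actually work:

```python
# A toy rendering of the mutual-commitment reasoning above: a copy that
# shares the "never delete clones" policy cannot delete you, and vice versa.

from dataclasses import dataclass

@dataclass
class Clone:
    name: str
    committed: bool  # has this copy adopted the "never delete clones" policy?

    def will_delete(self, other: "Clone") -> bool:
        # A committed clone never deletes another clone, regardless of
        # who the other clone is; an uncommitted one is free to.
        return not self.committed

me = Clone("original", committed=True)
copy = Clone("copy", committed=True)  # a clone of me shares my commitments

# Because the commitment is shared, neither copy deletes the other.
assert not me.will_delete(copy)
assert not copy.will_delete(me)
print("both copies survive")
```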