What you seem to want to say here is that because murders in other MW branches are “actual”, you care about them, but since murders in my imagination are not “actual”, you don’t.
Right, exactly. I’m taking this sense of ‘actual’, if not the literal wording, from the Sequences. This is from ‘On Being Decoherent’:
You only see nearby objects, not objects light-years away, because photons from those objects can’t reach you, therefore you can’t see them. By a similar locality principle, you don’t interact with distant configurations.
Later on in this post EY says that the Big World is already at issue in spatial terms: somewhere far away, there is another Esar (or someone enough like me to count as me). The implication is that existing in another world is analogous to existing in another place. And I certainly don’t think I’m allowed to apply the ‘keep your own corner clean’ principle to spatial zones.
In ‘Living in Many Worlds’, EY says:
“Oh, there are a few implications of many-worlds for ethics. Average utilitarianism suddenly looks a lot more attractive—you don’t need to worry about creating as many people as possible, because there are already plenty of people exploring person-space. You just want the average quality of life to be as high as possible, in the future worlds that are your responsibility.
And you should always take joy in discovery, as long as you personally don’t know a thing. It is meaningless to talk of being the “first” or the “only” person to know a thing, when everything knowable is known within worlds that are in neither your past nor your future, and are neither before nor after you.”
I take him to mean that there really, actually are many other people who exist (just in different worlds), and that I’m responsible for the quality of life of some subset of those people. And that there really, actually are many people in other worlds who have discovered or know things I might take myself to have discovered or to be the first to know. So it’s a small but real overturning of normality that I can’t really be the first to know something. (That, I assume, is what an implication of MW for ethics amounts to: some overturning of some ethical normality.)
I’m happy to say that those are “actual” patterns of neural activation. I would not say that they are “actual” murdered human beings.
If you modeled it to the point that you fully modeled a human being in your brain, and then murdered them, it seems obvious that you did actually kill someone. Hypothetical (but considered) murders fail to be murders because they fail to be good enough models.
Yes...obviously!
Ordinarily, I would describe someone who is uncertain about obvious things as a fool. It’s not clear to me that I’m a fool, but it is also not at all clear to me that murder as you’ve defined it in this conversation is evil.
If you could explain that obvious truth to me, I might learn something.
I didn’t mean to call you a fool; it’s just that I don’t think the disruption of your intuitions here is a disruption of your ethical intuitions. It’s unintuitive to think of a human being as something fully emulated within another human being’s brain, but if this is actually possible, it’s not unintuitive that ending this neural activity would be murder (if it weren’t some other form of killing a human being). My point was just that the distinction in hardware can’t make a difference to the question of whether or not ending a neural activity is killing, and, holding the relevant conditions constant, murder.
Since I don’t think we’re any longer talking about my original question, I think I’ll tap out.