Hmm, let’s say we don’t count identical copies as having moral weight, and we assume that Many Worlds is correct.
In that case, I build a device that will utterly annihilate the Earth with 50/50 probability based on a single quantum event: Schrödinger’s Really Big Nuke. That event happens or doesn’t happen, branching into two universes identical except for that single event under MWI, in one of which the Earth is immediately annihilated by the device while in the other it survives.
By the not-counting-duplicates theory, the moral weight of the annihilation of an entire planet of seven billion thinking beings is zero, because they were all duplicated by the quantum event that caused their destruction.
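Purely to make the bookkeeping explicit, here is a minimal sketch (in Python, not from the original comments) of the two accounting rules being compared; the population figure, the branch measure, and the function name are illustrative assumptions:

```python
# A minimal sketch of the moral-weight accounting in the thought experiment.
# The numbers and the "count_duplicates" rule are illustrative assumptions.

POPULATION = 7_000_000_000  # thinking beings on Earth in the thought experiment

def moral_weight_of_annihilation(count_duplicates: bool) -> float:
    """Aggregate moral weight lost in the branch where the nuke fires.

    Under MWI the quantum event splits the world into two equally weighted
    branches; at the moment of the split, everyone in the destroyed branch
    is an exact duplicate of someone in the surviving branch.
    """
    branch_measure = 0.5  # weight of the "nuke fires" branch
    if count_duplicates:
        # Each person in the destroyed branch counts in their own right.
        return branch_measure * POPULATION
    else:
        # Duplicates carry no extra weight: every victim still has an
        # identical copy in the surviving branch, so nothing is "lost".
        return 0.0

print(moral_weight_of_annihilation(count_duplicates=True))   # 3500000000.0
print(moral_weight_of_annihilation(count_duplicates=False))  # 0.0
```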
I think the same quantum event would have to happen even without the bomb; without the bomb it would just the same lead to forking as its influence propagates through interactions between atoms, etc. (and in either case the forking will be into a huge number of observers and will continue forever, except that with the bomb half of the observers will be dead). It seems to me you would have to violate conservation laws to actually create extra copies for you to morally-neutrally destroy.
Anyway, how would you count an AI running on a computer that has 2x the wire cross-section, 2x larger capacitor surfaces, 2x the current, etc. (versus another computer)? What if I add a non-functional dielectric, splitting each wire and each transistor in two, resulting in two computers that are almost superimposed on each other? Why should that change the count?
I would be curious what you think of my comment elsewhere in this thread: http://lesswrong.com/r/discussion/lw/9xw/33_holes_and_1031031_pigeons_or_vice_versa/5vcq
I’m currently trying to avoid having opinions on this whole subject. I kept going around in circles thinking about it; I’m now letting my back-brain see if it can come up with any insights. But yours is one of the ideas that has passed through my mind.
There’s an interesting interaction of “identical copies don’t mean anything” with one of the problem-of-identity solutions you see around this site, which is that you should treat copies and simulations of yourself as yourself, indeed in proportion to how closely they resemble you. If an identical or near-identical copy of me has moral weight when I’m deciding whether to one-box, or whether to defect in the Prisoner’s Dilemma, or the like, it would seem to have to carry the same weight in questions like this one, or vice versa.
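To spell out that consistency claim, here is a minimal sketch (not from the original comment) in which near-copies get a weight proportional to similarity; the specific copies, similarity scores, and function names are all illustrative assumptions:

```python
# A minimal sketch of the consistency point: if near-copies get weight
# proportional to similarity when I make decisions (one-boxing, Prisoner's
# Dilemma), the same weights should show up when I tally moral value.
# The copies and similarity scores below are illustrative assumptions.

copies = {
    "exact duplicate in another branch": 1.0,
    "high-fidelity simulation":          0.95,
    "me from ten years ago":             0.6,
}

def decision_weight(similarity: float) -> float:
    # Weight used in decision problems: "treat copies as yourself
    # in proportion to how closely they resemble you".
    return similarity

def moral_weight(similarity: float) -> float:
    # The claim under discussion: this should be the same function,
    # not a separate rule that zeroes out identical copies.
    return decision_weight(similarity)

for name, s in copies.items():
    print(f"{name}: decision weight {decision_weight(s)}, moral weight {moral_weight(s)}")
```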
Agreed.
But don’t avoid opinions; you can form some and always preface them with caveats, to get a sword out of that iron.