Suppose I’m duplicated, and both copies are told that one of us will be thrown off a cliff. While it makes some kind of sense for Copy 1 to be indifferent (or nearly indifferent) to whether he or Copy 2 gets tossed, that’s not what would actually occur. Copy 1 would probably prefer that Copy 2 gets tossed (as a first-order thing; Copy 1’s morals might well tell him that if he can affect the choice, he ought to prefer getting tossed to seeing Copy 2 getting tossed; but in any case we’re far from mere indifference).
There’s something to “concern for my future experience” that is distinct from concern for experiences of beings very like me.
I have the same instincts, and were my copy and I put in the situation you described, I would have a very hard time overriding them; but those instincts are wrong.
Emend “wrong” to “maladapted for the situation” and I’ll agree.
These instincts are only maladapted for situations found in very contrived thought experiments. For example, you have to assume that Copy 1 can inspect Copy 2’s source code. Otherwise she could be tricked into believing that she has an identical copy. (What a stupid way to die.) I think our intuitions are already failing us when we try to imagine such source code inspections. (To put it another way: we have very little in common with agents that can do such things.)
It would suffice, instead, to have strong evidence that the copying process is trustworthy; in the limit as the evidence approaches certainty, the more adaptive instinct would approach indifference between the cases.
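One rough way to make that limit claim concrete (a sketch of my own; $p$ and $V$ are stand-in symbols, not anything established above): let $p$ be Copy 1’s credence that the copying process is trustworthy, and let $V$ be the value of a genuine continuation surviving. If Copy 2 is tossed, Copy 1 survives for certain, which is worth $V$; if Copy 1 is tossed, a genuine continuation survives only with probability $p$, worth $pV$ in expectation. The gap between the two options is then

$$V - pV = (1 - p)\,V \;\longrightarrow\; 0 \quad \text{as } p \to 1,$$

so as the evidence for trustworthy copying approaches certainty, the adaptive preference between the cases shrinks toward exactly the indifference described above.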
Good thought experiment, but I would actually be indifferent, as long as I believed that my copy was genuine and wouldn’t also be thrown off a cliff. Unfortunately I can’t imagine any evidence that would convince me of this. I wonder if that’s the source of your reservations too: you imagine Copy 1 caring because you can’t imagine Copy 1 being convinced of the scenario.