Thought experiment. Imagine a machine that can create an identical set of atoms to the atoms that comprise a human’s body. This machine is used to create a copy of you, and a copy of a second person, whom you have never met and know nothing about.
After the creation of the copy, ‘you’ will have no interaction with it. In fact, it’s going to be placed into a space ship and fired into outer space, as is the copy of Person 2. Unfortunately, one spaceship is going to be very painful to be in. The other is going to be very pleasant. So a copy of you will experience pain or pleasure, and a copy of someone else will experience the other sensation.
To what extent do you care which copy receives which treatment? Zero? As much as you would care if it was you who was to be placed into the spaceship? Or something in between?
Quite a lot, but mostly out of fear of being wrong. I believe that identity is such that I will not wake up in the spaceship, and given that I'm correct, I would care very little about the fate of that copy of me; but my beliefs about identity are based on strong intuitions that aren't philosophically well-founded and merely seem "obvious". I'd guess I care about 5% as much as I would if I were the one placed in the spaceship, although "5% as much as if I were put in a torture-can and shot into space" is really abstract, and I have no idea how to convert that into actual amounts of caring.
Pre-copying I would care greatly. Post-copying I would mildly mourn my body double's suffering or celebrate their joy.
From both the future copy's experience and my own, we were once the same, so my present self is invested in the well-being of both. Post-copy we have split, and we care for each other to the extent that kin and kind care for each other.
For example, consider a lottery you have a 50% chance to win: before the draw you are greatly invested in the outcome; after the draw you barely give a thought to the alternate-timeline you that could have won.
I think this turns you into a money pump. Pre-split there’s some amount of money you will pay to have it be the other person experiencing pain rather than your double. Post-split you’ll need less money given back to you to incentivise you to let it be your double rather than the other person.
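A toy sketch of the pump (the specific numbers are illustrative assumptions, not from the thread): if you will pay more pre-split to route the pain away from your double than you will later accept post-split to route it back, a trader can cycle you for a guaranteed loss.

```python
# Toy money-pump: the agent pays 100 pre-split to protect its double,
# but post-split will sell that protection back for only 60.
PRE_SPLIT_WILLING_TO_PAY = 100     # paid so the stranger's copy gets the pain
POST_SPLIT_WILLING_TO_ACCEPT = 60  # accepted to let the double get the pain

def pump_once(agent_wealth: float) -> float:
    """One cycle: agent buys protection pre-split, sells it back cheaper post-split."""
    agent_wealth -= PRE_SPLIT_WILLING_TO_PAY
    agent_wealth += POST_SPLIT_WILLING_TO_ACCEPT
    return agent_wealth

wealth = 1000.0
for _ in range(3):
    wealth = pump_once(wealth)
print(wealth)  # 880.0 -- a guaranteed loss of 40 per cycle
```

Each cycle the agent ends up exactly where it started (protection sold back) but 40 poorer, which is what makes the preference pattern exploitable.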
I'm going to need to go back and brush up on the money-pump concept. But for now I'm on board with the idea that this would be mugging my body double, as drethelin said.
Wadavis v1.0 cares about all future versions of Wadavis. He accepts the deal and improves the life of the body double, Wadavis v2.0. Wadavis v1.1 is the planetside post-copy of Wadavis v1.0; he accepts the second deal and reduces the quality of life of Wadavis v2.0. It is a clear payout with no downside.
Wadavis v1.1 is a jerk who denied Wadavis v2.0 (who, remember, includes Wadavis v1.0 in their identity) agency over their own future. Wadavis v1.1 just mugged Wadavis v2.0 for the money Wadavis v1.0 paid for the better life.
Now, if Wadavis v1.0 is rational and cares for all future Wadavis versions, would he cooperate (pay) if he knew Wadavis v1.1 would defect (take the second option)? No, that would be foolish. So Wadavis v0.0 has precommitted to respect the rights and freedom of (cooperate with) all versions of Wadavis, e.g. not mug them of a luxury bought and paid for. Make sense?
Okay, that makes sense. So Wadavis v1.1 doesn’t care much about Wadavis v2.0, but he acts like he cares a lot?
That's right, but I want to double-check our connotations. "Acts" feels like faking or intentional signalling; how about: Wadavis v1.1 does not defect against kin and kind (other Wadavis versions, in this case) so that future kin and kind will cooperate with him. It's less a matter of acting and more a matter of these being the rules Wadavis follows while dealing with Wadavis: a Home-team bot cooperates with all other Home-team bots, even if defecting has a higher payoff for the tempted version. Schelling fences and such.
This all hinges on Wadavis v1.0 cooperating and having some sort of confidence that all future versions will cooperate. I think this is where it comes together: Wadavis v1.0 can simulate the behavior of future versions. If future versions cooperate, v1.0 cooperates. If future versions defect, v1.0 will defect and not invest in helping v2.0.
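The conditional-cooperation rule described here can be sketched in code (the payoff-free "simulation" is just v1.0 calling its successor's decision function; the names and policy are hypothetical illustrations, not anything specified in the thread):

```python
# v1.0 only invests in v2.0's better life if its simulation of the
# planetside successor v1.1 predicts that v1.1 will also cooperate
# (i.e. not mug v2.0 of the luxury later).

def v1_1_decision(precommitted: bool) -> str:
    """The successor's policy: honor the precommitment if one was made."""
    return "cooperate" if precommitted else "defect"

def v1_0_decision(precommitted: bool) -> str:
    """v1.0 simulates v1.1 and mirrors the predicted choice."""
    simulated = v1_1_decision(precommitted)
    return "cooperate" if simulated == "cooperate" else "defect"

print(v1_0_decision(precommitted=True))   # cooperate
print(v1_0_decision(precommitted=False))  # defect
```

This mirrors the structure of the argument: without the v0.0 precommitment, v1.0 predicts defection and therefore refuses to pay in the first place.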
I would care just slightly less than I would if my original body was sent instead (so I’d care a lot).
(I put a high probability on something along the lines of pattern identity theory being ‘correct’)
I would care as much as I would about any two random human beings, plus caring points for personal acquaintance.
Yep. I didn’t mean “act” as in “perform in a play” but as in “carry out an action”.
That’s not really a money pump, since you have to spend whatever resources it takes to create a bunch of clones and torture them.
The point isn’t whether I (the pumper) make a profit, it’s whether you (the pumpee) make a loss.
If the "money pump" needs to be imposed from the outside via threats, it's no different than mugging.