no. copies are all equally me until they diverge greatly; I wouldn’t mind 10 copies existing for 10 minutes and then being deleted any more than I would mind forgetting an hour. the “primary copy” is maybe a bad way to put it; I only meant that colloquially, in the sense that looking at that world from the outside, the structure is obvious.
the copy on the satellite would not disagree
yes, it would have the right, but as an FDT agent a copy would not disagree except for straight-up noise in the implementation of me; I might make a mistake if I can’t propagate information between all parts of myself, but that’s different
that sounds kind of disgusting to experience as the remaining agent, but I don’t see an obvious reason it should be a moral thing. if you’re the kind of agent that would do that, I might avoid you
copies are not property, they’re equal
that’s very complicated based on what the crime is and the intent of the punishment/retribution/restorative justice/etc
I read this as assuming that all copies deterministically demonstrate absolute allegiance to the collective self. I question that assertion, but have no clear way of settling the argument one way or the other. If ‘re-merging’ is possible, mergeable copies intending to merge should probably be treated as a unitary entity rather than as individuals for the sake of this discussion.
Ultimately, I read your position as stating that suicide is a human right; that secure deletion of an individual is not acceptable to prevent ultimate harm to that individual; but that it is acceptable to prevent harm caused by that individual to others.
This is far from a settled issue, and has an analogy in the question ‘should you terminate an uncomplicated pregnancy with terminal birth defects?’ Anencephaly is a good example of this situation. The argument presented in the OP is consistent with a ‘yes’, and I read your line of argument as consistent with a clear ‘no’.
Thanks again for the food for thought.
I acausally cooperate with agents who I evaluate to be similar to me. That includes most humans, but it includes myself REALLY HARD, and doesn’t include an unborn baby. (because babies are just templates, and the thing that makes them like me is being in the world for a year-ish.)
Is your position consistent with effective altruism?
The trap expressed in the OP is essentially a statement that approaching a particular problem involving uploaded consciousness, using the framework of effective altruism to drive decision-making, led to a perverse (brains in blenders!) incentive. The options at this point are:
a) the perverse act is not perverse
b) effective altruism does not lead to that perverse act
c) effective altruism is flawed; try something else (like ‘ideological kin’ selection?)
You are unequivocal about your disinterest in being on the receiving end of this brand of altruism, have also asserted that you cooperate acausally with agents similar to you (based on degree of similarity?), and previously asserted that an agent who shares the sum total of your life experience, less the most recent year, can be cast aside and destroyed without thought or consequence. So...do I mark you down for option c?