Did you see my discussion of semi-clones above (ctrl-f should find it for you)? Do you believe that you necessarily care about them? You might also want to look at the True Prisoner’s Dilemma to better understand the intuition behind not co-operating.
Yes, and I’m not sure it helps: I care about everyone, at least a little bit. I seem to care about people close to me more than distant ones, but I think I’d agree to be pelted with eggs to prevent a million tortured people from existing. There is some number less than a million where my intuition flips, and I suspect that I’m inconsistent and Dutch-book-able in the details (although log utility in the number of other-tortures-prevented might save me).
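To make that log-utility escape hatch concrete, here is a toy sketch. Everything in it is an illustrative assumption of mine: the fixed egg-pelting cost, the bare `math.log` utility, and the specific numbers are not from the thread. The point is just that a monotone utility in tortures prevented yields a single crossing point, so the “flip” is a consistent preference rather than an exploitable inconsistency.

```python
import math

# Toy model (all numbers are illustrative assumptions): a fixed personal
# disutility for agreeing to be pelted with eggs, compared against log
# utility in the number of other-tortures-prevented.
EGG_PELTING_COST = 10.0  # hypothetical disutility of being pelted

def value_of_preventing(n_tortures: int) -> float:
    """Log utility in the number of tortures prevented."""
    return math.log(n_tortures)

def would_agree(n_tortures: int) -> bool:
    """Agree to be pelted iff the prevented harm outweighs the cost."""
    return value_of_preventing(n_tortures) > EGG_PELTING_COST

# The intuition flips exactly once, at exp(EGG_PELTING_COST) ~ 22,026,
# i.e. "some number less than a million". Because log is monotone, the
# induced preference ordering is transitive: no simple Dutch book.
for n in (10, 20_000, 25_000, 1_000_000):
    print(n, would_agree(n))
```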
I don’t know a good way to construct the conundrum you’re trying to construct: one where I don’t care about the copies except for the one which is me. I kind of think identity doesn’t work that way: a perfect copy _is_ me, an imperfect copy is imperfectly me. I am you, if I had your genes and experiences rather than mine.
EDIT: epistemic status for this theory of identity: speculative. There is something wrong with the naive intuition that “me” is necessarily a singular thread through time, and that things like sleeping and significant model updates (and perhaps every update) don’t shift which future situations I care about more than others.
Sure, that is what you prefer, but is a selfish agent incoherent?
I think so. Caring about a future being (a “self”) only where there is physical continuity (imperfect as it is), while excluding continuity through copying, is wrong. The distinction between (imperfect) continuity and (partial) similarity likewise seems broken.