I think what I’m trying to get at is the question of whether a brain that is identical to yours is still you, even if it was created by brute force guessing rather than by studying and copying the first instance of your brain. (And also the equivalent done with uploads or simulated brains—I don’t think digital vs. biological should matter.)
Ah, I see.
This topic of what is “really you” recently came up on another thread as well; my position remains that whether some particular future entity is “really me” is a judgment each judge makes based on what that judge values most about me, and there simply is no fact of the matter. Some people who knew me at 20 might not believe that I’m still the same person, for example, simply due to the changes wrought by age and experience, and they aren’t wrong, they simply care about different things than I do.
The same goes for a future entity that shares my memories etc. coincidentally rather than causally (as in your example).
Me personally? I’m pretty liberal about identity; I’m happy to treat it as preserved through this sort of noncausal link. Indeed, if we really did create all possible human minds (which would require inconceivable resources), a vast number of those future minds would count, for me, as preserving my identity.
How about you? What has to be true of a future mind for you to treat it as a preservation of your identity?