Is isomorphism enough?
Consider gravity as an analogy.
A person who cares about bending spacetime lots is not equivalent to a person who cares about doing things isomorphic to bending spacetime lots. One will refuse to be replaced by a simulation, and the other will welcome it. One will try to make big compressed piles of things, and the other will daydream about making unfathomably big compressed piles of things.
Telling a person who cares about bending spacetime lots that, within the simulation, they’ll think they’re bending spacetime lots will not motivate them. They don’t care about thinking they’re bending spacetime. They want to actually bend spacetime. P wants X, not S(X), even though S(P) S(wants) S(X).
If isomorphism is enough, then the person who cares about bending spacetime lots, who wants X but not S(X), is somehow fundamentally misguided. One scenario I can think of where that would hold is a simulated world where simulated simulations are unwrapped (and hopefully placed within observable distance, so P can find out that X = S(X) and react accordingly). In other cases… well, at the moment, I just don’t see how it’s misguided to be P, wanting X but not caring about S(P) S(wanting) S(X).
I don’t want to think I’m conscious. I don’t want the effects of what I would do if I were conscious to be computed out in exacting detail. I don’t want people to tell stories about times I was conscious. I want to be conscious. On the other hand, I suppose that’s what most non-simulated evolved things would say...