Well, a thing that acts like us in one particular situation (say, a thing that types “I’m conscious” in chat) clearly doesn’t always have our qualia. Maybe you could say that a thing that acts like us in all possible situations must have our qualia?
Right, that’s what I meant.
This is philosophically interesting!
Thank you!
It makes a factual question (does the thing have qualia right now?) logically depend on a huge bundle of counterfactuals, most of which might never be realized.
The I/O behavior being the same is a sufficient condition for it to be our mind upload. A sufficient condition for it to have some qualia, as opposed to having our mind and our qualia, will be weaker.
What if, during uploading, we insert a bug that changes our behavior in one of these counterfactuals…
Then it’s, to a very slight extent, another person (with the line between me and another person being a gradual continuum rather than a sharp boundary).
but then the upload never actually runs into that situation in the course of its life—does the upload still have the same qualia as the original person, in situations that do get realized?
Then the qualia would be very slightly different, unless I’m missing something. (To bootstrap the intuition, I would expect the version of me that chooses vanilla ice cream over chocolate ice cream in one specific situation to have very slightly different feelings and preferences in general, resulting in very slightly different qualia, even if he never encounters that situation.) With many such bugs, the same would hold, but to a greater extent.
If there’s a thought that you sometimes think, but that doesn’t influence your I/O behavior, it can get optimized away.
I don’t think such thoughts exist (I can always be asked to say out loud what I’m thinking). Generally, I would say that a thought that never, even in principle, influences my output isn’t possible. (The same principle should apply to attempts to replace a thought with just a few bits.)