Yes, the answer to that question is mind-dependent. But this isn’t really a big deal. If a person decides that there is an 80% probability that a banana will appear in front of them, their P(a banana will appear in front of me) is 0.8. If a Midwesterner decides that they are guaranteed not to be in Kansas after waking up from a coma, their P(I am in Kansas) is about 0. If I decide that I am definitely not a brain in a vat, my P(I am a vat brain) is about 0.
I suspect there is still some way to get non-stupid probabilities out of this mess from a series of observations made by observer-moments, though I don’t know how to do it. Intuitively, the problem with deciding that your P(I am a vat brain) is 0 is that your pre-existing series of observations could have been made by a vat brain.
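To make that intuition concrete, here is a minimal sketch (a hypothetical two-hypothesis setup with made-up likelihood numbers, not anything the comment commits to): if every observation in your series is just as likely whether or not you are a vat brain, Bayesian updating never moves you off your prior, so a flat declaration that P(I am a vat brain) is 0 is doing all of the work.

```python
# Minimal sketch, assuming two hypotheses and invented likelihoods.
# If each observation is equally likely under "vat brain" and "embodied",
# the posterior never moves off the prior.

def update(prior, likelihoods, observations):
    """Return the posterior over hypotheses after a series of observations."""
    posterior = dict(prior)
    for obs in observations:
        # Multiply in the likelihood of this observation under each hypothesis.
        for h in posterior:
            posterior[h] *= likelihoods[h](obs)
        total = sum(posterior.values())
        posterior = {h: p / total for h, p in posterior.items()}
    return posterior

prior = {"vat brain": 0.5, "embodied": 0.5}   # hypothetical prior
likelihoods = {
    "vat brain": lambda obs: 0.9,   # a good vat would simulate these observations too
    "embodied": lambda obs: 0.9,
}
observations = ["I see hands", "I seem to be in Kansas", "no banana appears"]

print(update(prior, likelihoods, observations))
# -> {'vat brain': 0.5, 'embodied': 0.5}: the observations cannot
#    distinguish the hypotheses, so the prior never changes.
```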
To me, signing up for superpower surgery can raise P(if there exists a me, it is superpowered) arbitrarily high, but it would at the same time lower P(after the surgery there is a me) at the same rate.
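One reading of “at the same rate”, with toy numbers I am inventing for illustration: the probability that a superpowered me exists after the surgery factors as P(superpowered | a me exists) × P(a me exists), so pushing the first factor up while the second drops proportionally leaves the joint probability untouched.

```python
# Toy numbers only: the conditional rises while existence falls by the same
# factor, so the joint probability of "a superpowered me exists" is unchanged.
p_super_given_exists_before, p_exists_before = 0.10, 0.90
p_super_given_exists_after,  p_exists_after  = 0.99, 0.90 * 0.10 / 0.99

joint_before = p_super_given_exists_before * p_exists_before
joint_after  = p_super_given_exists_after * p_exists_after

print(joint_before, joint_after)   # both 0.09
```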
This would kinda leave a funny edge case where a brain in a vat could correctly conclude that “I don’t exist” if it finds evidence that nothing that fits its self-image exists in the world (i.e. beings with hands etc.). It would still be blatantly obvious that something is having the thought, and it would be really nice if “I” referred to that thing regardless of how you picture yourself.
You could have a situation where you are a brain in a vat sitting in your lap, with all your sensory inputs being conveyed by a traditional body. It would still be pretty challenging to determine whether you are your skull or the fishbowl in your hands. Maybe the multilayered use of “you” in the previous sentence points in the right direction? So what happens to the thing you are now (your extended-you) is a different question from what you will become (your core-you). That way the only way for core-you to terminate would be to stop having thoughts. Breaking the extended-you would thus not terminate your thoughts, and core-you would still be core-you.