To disagree with this statement is to say that a living brain, scanned, cloned, remade, and started up, will contain the exact same consciousness (not a similar one, the exact same thing itself) that simultaneously exists in the still-living original. If consciousness has an anatomical location, and is therefore tied to matter, then it would follow that this matter here is the exact same matter as that separate matter there. This is an absurd proposition.
You conclude that consciousness in your scenario cannot have 1 location(s).
If consciousness does not have an anatomical / physical location then it is the stuff of magic and woo.
You conclude that consciousness in your scenario cannot have 0 locations.
However, there are more numbers than those two.
The closest parallel I see to your scenario is a program run on two computers for redundancy, as is sometimes done in safety-critical systems. It is indeed the same program in the same state, but in 2 locations.
The two consciousnesses will diverge if given different input data streams, but they are (at least initially) similar. Given that the state of your brain tomorrow will be different from its state today, why do you care about the wellbeing of that human, who is not identical to now-you? Assuming that you care about tomorrow-you, why does it make a difference if that human is separated from you by time rather than by space (as in your scenario)?
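To make the parallel concrete, here is a minimal Python sketch (the `Process` class and its update rule are hypothetical, purely illustrative, not any real redundancy framework): two instances start in the same state, stay in lockstep on identical inputs, and diverge as soon as their inputs differ.

```python
# A minimal sketch of the redundancy analogy: two instances of one
# program, started in the same state.

class Process:
    def __init__(self, state: int):
        self.state = state

    def step(self, observation: int) -> None:
        # Deterministic update: identical inputs keep the copies in lockstep.
        self.state = self.state * 31 + observation

primary = Process(state=42)
backup = Process(state=42)        # same program, same initial state

for obs in (1, 2, 3):             # identical input streams...
    primary.step(obs)
    backup.step(obs)
assert primary.state == backup.state   # ...keep the states identical

primary.step(4)                   # the input streams diverge here,
backup.step(5)
assert primary.state != backup.state   # and the two states diverge with them
```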
Thanks for the reply.

“You conclude that consciousness in your scenario cannot have 1 location(s).” I’m not sure if this is a typo or a misunderstanding. I am very much saying that a single consciousness has a single location, no more, no less. It is located in those brain structures which produce it: one consciousness in one specific set of matter. A starting-state-identical consciousness may exist in a separate set of matter. That is a separate consciousness. If they were the same, then the set of matter itself would be the same set of matter, the exact same particles/wave-particles/strings/what-have-you. This is an absurdity. Therefore to say 2 consciousnesses are the same consciousness is an absurdity.
“It is indeed the same program in the same state but in 2 locations.” It is not. They are (note the plural pronoun) identical (congruent?) programs in identical states in 2 locations. You may choose to value both equally, but they are not the same thing in two places.
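In programming terms this is the difference between equality and identity; a minimal Python sketch (a toy example, nothing more) makes the distinction explicit:

```python
# Equality of state vs. identity of object: two objects can be
# indistinguishable in content yet remain two distinct things.
a = {"state": 42}
b = {"state": 42}

assert a == b         # identical (congruent) states
assert a is not b     # but two separate objects at two locations in memory

b["state"] += 1       # changing one copy...
assert a["state"] == 42   # ...leaves the other untouched
```

Equal in value at a given moment, yet never one object: each has its own location and its own future.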
My consciousness is the awareness of all input and activity of my mind, not the memory. I believe it is, barring brain damage, unchanged in any meaningful way by experience. It is the same consciousness today as next week, regardless of changes in personality, memory, or conditioned-response imprinting. I care about tomorrow-me because I will experience what he experiences. I care no more about copy-me than I do about the general public (with some exceptions if we must interact in the future), because I (the point of passive awareness that is the best definition of “I”) will not experience what he experiences.
I posed a question to entirelyuseless above: basically, would anything given to a copy of you induce you to take a bullet to the head?
Our instincts evolved in situations where copies did not exist, so taking a bullet to the head was always a loss. Regardless of what thought experiments you propose, my instincts will still reject the premises and assume that copies don’t exist and that the information provided to me is false.
If copying had been a feature of our ancient environment, organisms who took the bullet when it saved e.g. two of their copies would have had an evolutionary advantage (losing one copy to save two is a net gain of one copy). So their descendants would still hesitate about it (because the information that it will save their two copies could be false; and even if it is right, it would still be better to spend some time looking for a solution that might save all three copies), but ultimately many of them would accept the deal.
I’m not sure what the conclusion here is. On one hand, the fact that some hypothetical other species would have other values doesn’t say much about our values. On the other hand, the fact that my instincts refuse to accept the premise of your thought experiment doesn’t mean that their answer is relevant to it.