The Twins
Someone is pregnant with twins. At birth, the twins are taken blindfolded to separate rooms. Both rooms are identical on the interior; outside, a ‘1’ is printed on one door and a ‘2’ on the other. Once inside, the blindfolds are removed, and the twins are left alone in their separate rooms. The twins do not leave their rooms during the course of their childhood. The rooms contain machines to care for them during infancy, and each has a bathroom and shower for when they get older. Rather than encountering other humans, they are raised by the exact same computer programs, which give them their education. Slots in the walls provide food and water, and any meal distributed to one twin is given in the exact same type and quantity to the other.
The computers are programmed to be interactive, allowing the children to develop social abilities. Now, imagine the program asks them a question. The question could be anything, such as “what’s your favorite color?” or “what do you want to do when you leave this room?”. What matters is how the children respond. Would their answers be exactly the same? The determinist answer would be yes: they have received the exact same sensory input, possess the same DNA, and share the same environment and diet. Someone who believes quantum randomness plays a role in cognition might disagree, holding that tiny quantum variations could eventually produce observable differences in behavior. Those familiar with Chaos Theory would likewise point out that small variations, which no experiment could fully control, could compound into behavioral differences. Genetic mutation poses another problem: while the twins’ DNA begins nearly identical, all genes remain subject to mutation over the course of a person’s life, and these mutations could also affect behavior.
But what if all small variations could be controlled? What if quantum variations played no role in cognitive differences? What if even their genetic mutations were controlled, or happened identically? Each child would then be truly identical. They would answer questions the same way, in the same tone. They would get up at the same time, go to the bathroom at the same time. They would think the same thoughts, and have the same dreams.
Now imagine they were both allowed outside their rooms at the same time, with the numbers on their doors covered up. The hall between their rooms is plain white, with no features save a light bulb at the dead center between the doors. Encountering the other would be like watching a live camera feed of yourself. When one raised a hand, so would the other. If they had not been informed of the other’s existence, both would express alarm: they might scream at the same time, or run away at the exact same speed. If one said ‘hello’, so would the other. A conversation would be impossible, since both would say the same things at the exact same time. The only way they could talk is if a third party decided who would speak first.
This could be used to illustrate the problem of free will, since it is evident neither twin had any control over who they would grow up to be. But it raises a larger, more complicated question. Not only would their thoughts and actions be identical; their subjective experience would be identical too. They would wake up to the exact same experience of observing the ceiling of their room. They would feel the exact same shock at seeing the other. They would have the same experience of having the exact same dreams. With this level of mirrored consciousness, are the twins even subjectively different people?
Whichever twin you are, you would have the same conscious experience in your room. You would have no way of knowing whether you were in the room whose door was marked ‘1’ or ‘2’. What would happen if a third party decided who would speak first? If subjectively they truly are the same person, then it would seem that what was previously one subjective experience would split into two. This would mean that each unique subjective experience is singular: no matter how many copies of the experience exist, for the subject it is one single person. But if subjective experience, even when identical, is not singular, then despite being the same, the twins would still be different people. They would both have the same experience, which would merely change when a third party was introduced.
While this might seem like a mere hypothetical, it has strong implications for a possible future technology: brain emulation, or mental uploading. This technology would involve scanning a subject’s brain, capturing its complete structure and function. From this information, a “copy” is created and “simulated” on a computer. Some claim this would be just a copy, not the actual person, while others argue the copy is indistinguishable from the original. The first camp would say the twins are separate; the second might say the twins are the same.
If subjective experience is singular, then mental uploading would amount to an arbitrary change of hardware. If, instead, identical experiences are not singular, then it would not matter that the upload’s subjective experience was the same as your own. It would have a conscious experience of its own, completely separate from yours.
If subjective experience is singular, then why would the two split when the third party intervened? Because the third party would pick one of the twins to speak first. Let’s say they picked the person from Door 1. If so, the twin from Door 1 would now have an experience unique to herself (I’ll assume a gender for simplicity). Even if both returned to their respective rooms, the twin behind Door 1 would have had a different experience. Her memory of the encounter would be different. She might even identify herself as the one who was chosen first. This small change would be enough to create a unique person.
Now, what if no third party intervened? The twins would fail to communicate; let’s say they eventually gave up and returned to their respective rooms. At this point, the computer program instantly kills one of the twins. It is done so fast that the twin never sees it coming, nor feels it happen. One moment she is alive, the next she is dead. If the twins are subjectively the same person, if consciousness is singular, it seems this would make no difference. Nothing has been altered in the subjective experience of the survivor, so the consciousness itself would remain the same for her.
But what if a third party did intervene, and chose the woman from Door 1 to speak first? Afterward, the two go back to their respective rooms, and the program kills the woman behind Door 1, instantly and without her noticing. Now it would seem that person has subjectively ceased to exist. She would no longer be subjectively the same, since the surviving twin’s experience now differs from that of the twin behind Door 1. One consciousness continues; the other does not. But if this is the case, what really killed the consciousness of the twin behind Door 1? It would seem it was the third party. Had the third party not intervened, the twins’ subjective experiences would still be identical. But because the intervention happened, a unique experience was destroyed.
But isn’t this happening all the time? Every time you wake up, every new memory you make, every letter I type while writing this: all of these things alter subjective experience. Someone’s unique experience of watching a sunset is then changed into an experience of the stars coming out. If so, would it mean we are constantly destroying our own subjective experience just by changing it?
These approaches take an “all or nothing” stance: all the qualities of consciousness are either destroyed or preserved. Perhaps, instead of being one single experience, consciousness is made up of many different qualities that result in a seemingly unified experience. Your conscious experience does change after watching a sunset, but you will probably experience other sunsets in the future. None of them will be exactly the same, and your experience of them will vary with your mood and what you are thinking at the time, but there will also be similarities. Many of the sun’s photons will enter your retina the same way, and your neurons will recreate a similar image. It may not be an identical copy, but it may contain many identical qualities.
If this is the case, even after the twin behind Door 1 is killed, some of her experiences might subjectively persist. The surviving twin’s new memory will alter some subjective experiences, but perhaps not all. Her experience of waking up and staring at the ceiling might be the same as it would have been without the third party’s intervention. Some subjective qualities, then, would be the same as they would have been had no twin been killed, or had no third party intervened at all.
If this is the case, are we sharing qualities of subjective experience at times? Is listening to a piece by Mozart in some way the same for us as it is for others? And if it is, is the quality of the subjective experience in some way identical? How does this way of thinking square with the perception of a linear passage of time? Is the experience timeless, or dependent on future repetition to continue to exist? How would this be affected by the Many-Worlds interpretation of quantum mechanics, or the Multiverse, where there might be several subjectively similar ‘yous’ out there, sharing many of the same subjective qualities? Too little is known about consciousness to be sure about any of this.
Put me in the chaos camp: minor variations compound until they’re distinct people. And even if you eliminate every tiny variance, they STILL diverge before they meet each other: one turned left out of her room and the other turned right.
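The compounding the chaos camp has in mind can be made concrete with a toy model. Here is a minimal sketch (my own illustration, not from the essay) using the logistic map, a standard example of a chaotic system: two “twins” whose internal states differ by one part in a trillion end up on completely unrelated trajectories within a few dozen steps.

```python
# Sensitive dependence on initial conditions, sketched with the
# logistic map x -> r*x*(1-x) at r = 4.0, a fully chaotic regime.

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

twin_a = 0.4
twin_b = 0.4 + 1e-12  # a perturbation far below any plausible measurement
max_gap = 0.0

for step in range(60):
    twin_a = logistic(twin_a)
    twin_b = logistic(twin_b)
    max_gap = max(max_gap, abs(twin_a - twin_b))

# The initial 1e-12 difference roughly doubles each step, so within
# about forty iterations the two trajectories are no longer related.
print(f"max divergence after 60 steps: {max_gap:.3f}")
```

Nothing here depends on quantum randomness; perfectly deterministic rules plus an uncontrollably small difference in starting state are enough to produce macroscopically different “people.”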
I got something from the essay related to people who are brain uploads, RL+LLM agents, or other programs that end up “humanlike but not actual meat people”.
There’s not just a question of whether chaos exists in our hardware; there’s also a question of whether it is good that this chaos is in our hardware.
Should we pay the costs to retain it, adding it in on purpose for digital people, when retaining it might be expensive?
For meat people, the cheap thing is to be subject to it by default… but if we should NOT put effort into ADDING it for digital people, then maybe we should also put effort into reducing the influence of sensitive dependence on random input on ourselves.
This helps explain why determinism is weird.