There might be some incomplete separation in whether you truly think of memories as not being part of consciousness. Let's say that we keep your "awareness" intact but inject and eject memories out of it. Let's do so in a cyclical manner, so that on alternate days you remember your "odd-day memories" and your "even-day memories". Now if I ask you what you did yesterday, you should not be able to answer with knowledge (you might guess, but whatever). Can we still coherently hold that you are just one awareness with two sets of memories? Or have you in fact become two awarenesses?
We could then do an information split where for half of every second your brain has access only to your ears and memory set 1, and for the other half only to your eyes and memory set 2. Are you still one awareness or two? How about if we run those in parallel, so that the ears are connected to memory set 1 and the eyes to memory set 2: there is no switching, but no crossover either.
If you were an upload, we could have your "ear module" on one side of the brain and the "eye module" on the other side. Suppose there is a wire connecting them so that the whole is isomorphic to your human-based cognition (I don't know exactly what information would be transferred over the wire, but barring ping times it should be doable somehow, and latency might be overcome by using something faster than neurotransmitters). You should be one awareness now, right? Now if we cut the wire, only the transfer of information should be blocked, and no "awareness" removed (those are supposedly in the black boxes). How many awarenesses are you with the cord cut? You won't be symmetric (the ambiguity of English pronouns works great here), but wouldn't the cord-cutting be equivalent to a software separation that forbids memory crossover?
Then there is the case of the sheet brains of the Ebborians from the Eliezer texts. Suppose that you are implemented in hardware where each wire has a subsection that divides it into two. While the subsection is in the "open" state it allows electrons to pass freely across. However, when it is in the "closed" state the electrons stay on their own sides. In "open" it functions as one wire, and in "closed" as two wires. When we transition from open to closed, the divider in effect makes two separate but identical circuits that should stay in sync. But have we doubled the number of awarenesses? What is the difference between engaging the subsection and building an identical circuit next to the old one?
If you move from "closed" to "open", does that sync the (possibly two) awareness(es), or does it fuse them into one?
There are possibly quite real-world analogs to these conditions. It's hard to remember your dreams, and in a dream it's hard to remember that you went to sleep just hours ago. And people have been lobotomized, and at least one such lobotomized person, on the basis of a brain scan, answered the question "do you believe in God?" oppositely depending on the lobe (comparable to the polygraph standard of "honest"). I remember someone joking about the puzzle this makes for theologians: whether the person fulfills "believes in God", i.e. whether he is going to heaven or hell.
Great thought experiment, thanks. I do define consciousness as a passively aware thing, totally independent of memory. The demented, the delirious, the brain damaged all have (unless those structures performing the function of consciousness are damaged, which is not a given) the same consciousness, the same Self, as I define it, as they did when their brains were intact. Dream Self is the same Self as Waking Self to my thinking. I assume consciousness arises at some point in infancy. From that moment on it is Self, to my thinking.
In your 2 meat scenarios I still count one consciousness, being aware of different things at different times.
In wire form, if those physical structures (wires) on which consciousness operations occur (no other wires matter to the question) are cleaved, two consciousnesses exist. When their functionality is rejoined, one consciousness exists. Neither, I suppose, can be considered "original" nor "copy", which queers the pot a bit vis-à-vis my original thesis. But then suppose, during a split, B is told that it will be destroyed while A lives on; I don't imagine B will take much consolation, if it is programmed to feel such things. Alternately, split them and ask A which one of the two should be destroyed: I can't imagine it would choose B.
What if the running of the two programs is causally separated but happens on common hardware? And when the circuits are separated, their function isn't changed. Is the entity and awareness of A+B not still intact? Can awareness be compromised without altering function?
Also, are lobotomized persons two awarenesses? What is the relevant difference from the subsection circuitry?
I’m not sure I follow your first point. Please clarify for me if my answer doesn’t cover it. If you are asking whether multiple completely non-interacting, completely functional minds running on a single processing medium constitute separate awarenesses (consciousnesses), or whether two separate awarenesses could operate with input from a single set of mind-operations, then I would say yes to both. Awareness is a result of data processing: 1s and 0s, neurons either firing or not. Multiple mind operations can be performed in a single processing substrate, i.e. memory, thought, feeling, which are also results of data processing. If awareness is compromised we have a zombie, open to some discussion as to whether or not other mind functions have been compromised, though it is, of course, generally accepted that behaviorally no change will be apparent.
If the processing media are not communicating, A and B are separate awarenesses. If they are reconnected in such a way that neither can operate independently, then they are a single awareness. As an aside, I suspect any deviation which occurs between the two during separation could result in bugs, up to and including systems failure, unless a separate system exists to handle the reintegration.
I don’t believe enough is known about the brain for anyone to answer your second question. Theoretically, if more than one set of cells could be made to function to produce awareness (neuroplasticity may allow this), then a single brain could contain multiple functioning awarenesses. I doubt a lobotomy would produce this effect; more likely the procedure would damage or disable the function of the existing awareness. Corpus callosotomy would be the more likely candidate, but, again, neuroscience is far from giving us the answer. If my brain holds another awareness, I (the one aware of this typing) value myself over the other. That it is inside me rather than elsewhere is irrelevant.
There is a difficult edge case when the systems are connected but don't need the connection to function. Separated or fused, the outcome of the data processing is going to be the same in the subsection thought experiment. If the gate is open but the individual electrons happen to stay on their own edges of the wire, it could be seen as similar to having a software separation within one piece of hardware. It's not that their influence on each other is impossible; they just in fact don't influence each other. If you merely change what is possible, while what they in fact end up doing remains the same, it would be pretty strange if that changed the number of awarenesses.
I seem to be getting the vibe that you believe awareness is singular, in the sense that you either have it or you don't, and it can't fragment into pieces.
I am wondering what kind of information processing counts as awareness, in your opinion. Sometimes organizations get a task that is in fact carried out by small teams. When those teams undercommunicate, misunderstandings can happen. In a good team there is sufficient communication that what is going to happen is common knowledge, at least to the point that no contradictory plans exist among different team members. In a shattered corporation there is no "official corporate line", while in a well-coordinated corporation there might be one, even if it is narrower than any one member's full opinion. While the awareness of the individual team members is pretty plain, can the corporation become aware separately from its members? With the brain the puzzle is similar, but the pieces are pretty plainly not aware.
It does seem to me that you chase the awareness into an unknown black box. In the corporation metaphor, the CEO's awareness counts as the corporation's awareness to the extent there is any point in discussing it. However, in the "unaware pieces" picture this would lead to some version of panpsychism (or some less symmetrical version where there is a distinguished ontological class that has awareness as an elementary property).