Yes. I would consider those states to be “unconscious”. I am not using “conscious” or “unconscious” as pejorative terms or as terms with any type of value, but purely as descriptive terms that describe the state of an entity. If an entity is not self-aware in the moment, then it is not conscious.
People are not aware of the data processing their visual cortex is doing (at least I am not). When you are not aware of the data processing you are doing, the outcome of that processing is “transparent” to you; that is, the output is achieved without an understanding of the path by which it was achieved. Because you don’t have the ability to influence the data processing your visual cortex is doing, its output is susceptible to optical illusions.
Dissociation is not uncommon. Thinking about it, I believe I dissociate quite a bit, and that it is fairly easy for me to dissociate. I do my best intellectual work when I am in what I call a “dissociative focus”, where I really am quite oblivious to a lot of extraneous things, even to my physical state: hunger, fatigue, those kinds of things.
I think that entering a dissociative state is not uncommon, particularly under conditions of very high stress. I think there is a reason for that: under very high stress, all the computational resources of the brain are needed to deal with whatever is causing that stress. Spending computational resources on being conscious or self-aware is a luxury an entity can’t afford while it is “running from a bear” (to use my favorite extreme stress state).
I haven’t looked at the Living Luminously sequence carefully, but I think I mostly disagree with it as something to strive for. It is ok, and if that is what you want to do, that is fine, but I don’t aspire to think that way. Trying to think that way would interfere with what I am trying to accomplish.
I see living while being extremely conscious of self (i.e. what I understand the luminous state to be) and living while dissociated from consciousness as two extremes along a continuum: thinking with your “theory of mind” (the self-conscious luminous state) versus thinking with your “theory of reality” (the dissociative state). I discuss that in great detail on my blog about autism.
If you are not in a mode where you are thinking about entities, then you are not using your “theory of mind”. If you are thinking about things in purely non-anthropomorphic terms, you are not using your “theory of mind”.
I think these two different states are useful for thinking about different kinds of problems. Interpersonal problems, interactions with other people, and communication are best dealt with by the “theory of mind”. All the examples in the Seven Shining Stories are what I would consider pretty much pure theory of mind-type problems. Theory of reality-type problems are things like the traveling salesman problem, multiplying numbers, and running more algorithm-like procedures such as counting: problems where there is little or no interpersonal or communication component.
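To make the contrast concrete, here is a minimal sketch of one of those “theory of reality” tasks, a brute-force traveling salesman solver. The city names and distances are made up for illustration; nothing about the solution requires modeling another mind, only mechanical search.

```python
# Illustrative only: a tiny brute-force traveling salesman solver,
# the kind of purely algorithmic "theory of reality" task described
# above. The four cities and their distances are hypothetical.
from itertools import permutations

# symmetric distance table between four made-up cities
dist = {
    ("A", "B"): 10, ("A", "C"): 15, ("A", "D"): 20,
    ("B", "C"): 35, ("B", "D"): 25, ("C", "D"): 30,
}

def d(x, y):
    # look up a distance in either direction
    return dist[(x, y)] if (x, y) in dist else dist[(y, x)]

def tour_length(order):
    # total distance visiting cities in order, then returning home
    return sum(d(order[i], order[(i + 1) % len(order)])
               for i in range(len(order)))

cities = ["A", "B", "C", "D"]
# fix the start city and try every ordering of the rest
best = min((("A",) + rest for rest in permutations(cities[1:])),
           key=tour_length)
print(best, tour_length(best))  # shortest tour has length 80
```

Brute force is only workable for a handful of cities (the number of tours grows factorially), but it shows the point: the problem is solved entirely without any theory of mind.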
Inklesspen’s argument (which you said you agreed with) was that my belief in a lack of personal identity continuity is incompatible with being unwilling to accept a painless death, and that this constitutes a fatal flaw in my argument.
If there are things you want to accomplish, and you believe the most effective way to accomplish them is to upload what you believe will be a version of your identity into an electronic gizmo, all I can say is good luck with that. You are welcome to your beliefs.
In no way does that address Inklesspen’s argument that my unwillingness to immediately experience a painless death somehow contradicts or disproves my belief in a lack of personal identity continuity, or constitutes a flaw in my argument. I don’t associate my “identity” with my consciousness; I associate my identity with my body, and especially with my brain, though it is coupled to the rest of my body. That my consciousness is not the same from day to day is not an issue for me. My body very much is alive and is quite good at doing things. It would be a waste to kill it. That it is not static is actually quite a feature: I can learn and do new things.
I have an actual body with which I can do actual things and with which I am doing actual things. All that can be said about the uploading you want to do is that it is very hypothetical. There might be electronic gizmos in the future that might be able to hold a simulation of an identity that might be able to be extracted from a human brain and that electronic gizmo might then be able to do things.
Your belief that you will accomplish things once a version of your identity is uploaded into an electronic gizmo is about you and your beliefs. It is not in the slightest bit about me or my reasoning that a belief in personal identity continuity is an illusion.
That people profess a belief in an actual Heaven where they will receive actual rewards doesn’t constitute evidence that such beliefs are not illusory either. Such people are usually unwilling to allow themselves to be killed to reach those rewards sooner. That unwillingness does not prove their beliefs are illusory, any more than a willingness to be killed would prove they were non-illusory. The members of the Heaven’s Gate group believed they were uploading their identities to some kind of Mother Ship electronic gizmo, and they were willing to take cyanide to accelerate the process. Their willingness to take poison does not constitute evidence (to me) that their beliefs were not illusory.