Why would authors model fictional minds with unusual high-level skills that they do not possess and, in all likelihood, have neither heard of from other sources nor conceived independently?
Because the lack of skill is not transparent.
By way of analogy, it takes a reasonably uncommon bit of knowledge to be aware that human vision includes blind spots and to know how they work. Even though every human has blind spots (if indeed they can see at all) and can confirm their existence with easy tests, many authors will not think to write fictional characters with visual blind spots, because to someone going about ordinary life and looking at things, it is not obvious that blind spots exist. The author knows what's within the character's visual range, and if the character's eyes are open, the character can see all of those things.
Luminosity is rare. Knowledge that luminosity is rare is rare. Most people trust in naive introspection. They believe the first approximations that pop into their heads when they think about themselves, in the same way that they accept what their visual cortices tell them. And the author knows what's in the character's mind, and if the character introspects, the character can see all of those things.
I think I might not get it. How could someone be introspective but not luminous? I haven’t put any real effort into reading the luminosity sequence because it seemed so fundamentally obvious to me. It is possible that I am low-level and deceived in ways that I am totally unaware of, but it’s always been easy—trivial, even—for me to see through my brain’s “fake explanations” and understand the actual reasons behind my thoughts or traits.
Sounds like you’re an anomaly, if what you say is true. Naive introspection is generally fallible.
What are the standard failure modes that you’ve encountered? I need to test myself more thoroughly.
I am going to assume provisionally that you do not mind answers from people other than Alicorn.
Situations that have probably caused me to become less aware of the true reasons for my thoughts and actions:
needing to stand up for myself (i.e., to argue on my own behalf) in what a friend of mine referred to as a "pecking situation," that is, one where ordinary people without a strong commitment to epistemic purity constantly try to one-up me and each other;
needing to sell myself, e.g., in a long series of job interviews or dates;
living for months with chronic pain;
getting older (I am 49);
being very afraid
(end of list).
The changes I had to make to myself over the months and years to become marginally competent in the "pecking situation" almost certainly interfered with my self-awareness, though I never despaired of eventually regaining that self-awareness with enough work.
Something caused me to pay much less attention to my felt sense and my moment-to-moment emotional reactions, and my having lived for years with chronic pain is one of the most likely causes.
There’s some “obviousness” to it, yes.
I think that’s not necessarily a fault; it’s easy to grasp the idea of luminosity, but sometimes people don’t do it. Can you do it under stress?
It’s simple—but not necessarily easy.
Well, there’s the joke that authors understand themselves more than anyone else does—not necessarily better, just more.
I do think it’s possible to be fascinated by one’s own internal processes while not noticing a few hot button areas.
katydee:
It is possible that I am low-level and deceived in ways that I am totally unaware of, but it's always been easy—trivial, even—for me to see through my brain's "fake explanations" and understand the actual reasons behind my thoughts or traits.
Warning: bombastic (but sincere) sentence ahead.
[There were two paragraphs here, but Less Wrong is better without them. It is not that I concluded that on second thought, what I wrote here is wrong, but rather that if I just assert it in a shorthand way like I did, and do not provide any arguments for why I believe it to be true, well, most are going to find it ridiculous, and I do not have time to advance the arguments or even to explain with sufficient care what my assertion is. Another factor that had a slight effect on my decision is that I have not read Alicorn’s luminosity sequence.]