Copying from my Twitter response to Eliezer:

Anil Seth usefully breaks down consciousness into 3 main components:
1. level of consciousness (anesthesia < deep sleep < awake < psychedelic)
2. contents of consciousness (qualia — external, interoceptive, and mental)
3. consciousness of the self, which can further be broken down into components like feeling ownership of a body, narrative self, and a 1st person perspective.
He shows how each of these can be quite independent. For example, the selfhood of body-ownership can be fucked with using rubber arms and mirrors; the narrative self breaks down with amnesia; the 1st person perspective breaks in out-of-body experiences, which can be induced in VR; even the core feeling of the reality of the self can be meditated away.
Qualia such as pain are also very contextual: the same physical sensation can be interpreted positively in the gym or a BDSM dungeon, and as acute suffering if it’s unexpected and believed to be caused by injury. Being a self, or thinking about yourself, is also just another perception — a product of your brain’s generative model of reality — like color or pain are. I believe enlightened monks who say they experience selfless bliss, and I think it’s equally likely that chickens experience selfless pain.
Eliezer seems to believe that self-reflection or some other component of selfhood is necessary for the existence of the qualia of pain or suffering. A lot of people believe this simply because they use the word “consciousness” to refer to both (and 40 other things besides). I don’t know if Eliezer is making such a basic mistake, but I’m not sure why else he would believe that selfhood is necessary for suffering.
I agree with pretty much all of that but remark that “deep sleep < awake < psychedelic” is not at all clearly more correct than “deep sleep < psychedelic < awake”. You may feel more aware/conscious/awake/whatever when under the effects of psychedelic drugs, but feeling something doesn’t necessarily make it so.
The ordering is based on measures of neuro-correlates of the level of consciousness like neural entropy or perturbational complexity, not on how groovy it subjectively feels.
It would seem a bit optimistic to call anything a “neuro-correlate of the level of consciousness” simply on the basis that it’s higher for ordinary waking brains than for ordinary sleeping brains. Is there more evidence than that for considering neural entropy or perturbational complexity to be measures of “the level of consciousness”?
(My understanding is that in some sense they’re measuring the amount of information, in some Shannonesque sense, in the state of the brain. Imagine doing something like that with a computer. The figure will—at least, for some plausible ways of doing it—be larger when the computer is actively running some software than when it’s idle, and you might want to say “aha, we’ve found a measure of how much the computer is doing useful work”. But it’s even larger if you arrange to fill its memory with random bits and overwrite them with new random bits once a second, even though that doesn’t mean doing any more useful work. I worry that psychedelics might be doing something more analogous to that than to making your computer actually do more.)
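To make that worry concrete, here is a toy sketch in Python (my own illustration: the LZ78-style phrase count, median binarization, and synthetic signals are crude stand-ins for the actual Lempel-Ziv machinery and EEG data behind neural-entropy and PCI measures). It shows a simple compressibility score handing its highest value to pure noise rather than to the most structured activity.

```python
import numpy as np

rng = np.random.default_rng(0)

def lz_phrase_count(bits: str) -> int:
    """Greedy LZ78-style parse: count distinct phrases.
    Regular signals parse into few phrases; noise into many."""
    seen, phrase, count = set(), "", 0
    for b in bits:
        phrase += b
        if phrase not in seen:
            seen.add(phrase)
            count += 1
            phrase = ""
    return count + (1 if phrase else 0)

def binarize(x: np.ndarray) -> str:
    """Threshold at the median, roughly as EEG-based measures do."""
    return "".join("1" if v else "0" for v in x > np.median(x))

n = 4096
t = np.arange(n)
slow_regular = np.sin(2 * np.pi * t / 256)                 # regular, compressible ("idle")
structured = slow_regular + 0.5 * rng.standard_normal(n)   # structure plus variability ("busy")
pure_noise = rng.standard_normal(n)                        # memory rewritten with random bits

for name, sig in [("slow_regular", slow_regular),
                  ("structured", structured),
                  ("pure_noise", pure_noise)]:
    print(f"{name:12s} {lz_phrase_count(binarize(sig))}")
```

The pure-noise signal scores highest of the three, even though it corresponds to no extra useful work; that is exactly the ambiguity in reading “higher complexity” as “more conscious”.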
Eliezer seems to believe that self-reflection or some other component of selfhood is necessary for the existence of the qualia of pain or suffering. A lot of people believe this simply because they use the word “consciousness” to refer to both (and 40 other things besides). I don’t know if Eliezer is making such a basic mistake, but I’m not sure why else he would believe that selfhood is necessary for suffering.
It is not my impression that Eliezer believes any such thing for pain, only (perhaps) for suffering. It’s important not to conflate these.
It seems clear to me, at least, that consciousness (in the “subjective, reflective self-awareness” sense) is necessary for suffering; so I don’t think that Eliezer is making any mistake at all (much less a basic mistake!).
Being a self, or thinking about yourself, is also just another perception — a product of your brain’s generative model of reality — like color or pain are
The word “just” is doing a heck of a lot of work here.
I think it’s equally likely that chickens experience selfless pain
Chickens perhaps have “selfless pain”, but to say that they experience anything at all is begging the question!
Eliezer seems to believe that self-reflection or some other component of selfhood is necessary for the existence of the qualia of pain or suffering. A lot of people believe this simply because they use the word “consciousness” to refer to both (and 40 other things besides). I don’t know if Eliezer is making such a basic mistake, but I’m not sure why else he would believe that selfhood is necessary for suffering
I strongly support this. If you are going to explain away qualia as the result of having a self-model, you need to do more than note that they occur together, or that “conscious” could mean either.