If I drive a car (especially on known routes), my “auto-pilot” sometimes takes over. I stop at a red light, but my mind is primarily focused on visually modeling the buttocks of my girlfriend in various undergarments or none at all. Am I actually “aware” of having stopped at the red light? Probably I was as much “aware” of the red light as a cheetah is aware of eating the carcass of a gazelle. Interestingly, my mind seems capable of visually modeling buttocks in my mind’s eye and reading real visual cues like red lights and habitually reacting to them, all at the same time. It seems I was more aware of my internal visual modeling than of the external visual cue, however. In a sense I was aware of both, yet I’m not sure I was “self-aware” at any point, because whatever that means, I feel like being self-aware in that situation would actually result in me going “Jesus, I should pay more attention to driving; I can still enjoy those buttocks in real life once I actually manage to arrive home unharmed”.
So what’s self-awareness then? I suppose I use that term to mean something roughly like: “thoughts that include a model of myself while modeling a part of reality on the fly based on current sensory input”. If my mind is predominantly preoccupied with “daydreaming”, i.e. creating and playing with a visual or sensory model based on manipulating memories rather than real sensory inputs, I don’t feel the term “self-awareness” should apply, even if that daydreaming encompasses a mental model of myself slapping a booty or whatever.
That’s surely still quite ill-defined and far from maximally useful, but whenever I’m tempted to use the word self-aware, I seem to roughly have something like that definition in mind. So if we were to use “consciousness” as a synonym for self-awareness (which I’m not a fan of, but quite a few people seem to be), maybe my attempt at a definition is a start toward something more useful, since it includes at least some of the “mental features” we seem to care about, like “a model of oneself” and “interpreting sensory input to create a model of reality”.
The problem is that rats can construct models of reality too, and these models outlive their sensory inputs, which is pretty clear from experiments that put rats in mazes. The rats are stuck for some time in a maze with no exit and no rewards present, but during that time they learn its layout, even though it’s empty and they are not externally rewarded for doing so. Once you drop a treat into that maze, the rats who were able to wander around it beforehand know exactly how to get there as fast as possible, while rats new to that particular maze do not (this is “latent learning”, one of the findings behind the “cognitive revolution” in psychology). Presumably their rat-mind also features some kind of model of themselves, though presumably one that mainly features their body, not so much their mind.
So to make the concepts of self-awareness and perhaps consciousness more useful, maybe what we really care about in the end is a mind being able to feature a model of its own mind (and thus of what we call “ourselves”).
This is quite interesting: young children and, for example, gorillas who were taught to communicate in sign language seem to lack a fully developed “theory of mind”. Meaning it seems they can’t conceive of the possibility that other minds contain things theirs does not… well, kind of. If they do model other minds, they seem to model them a lot like copies of their own mind, or perhaps just slightly altered copies. Gorillas that can communicate in sign language are perfectly capable of answering questions about, e.g., their mood, implying self-awareness that goes somewhat beyond just recognizing their physical reflection in a mirror: they are also aware of their own feelings, i.e. their internal experiences. But they never seem to get the brilliant idea of asking you a question, presumably because they can’t conceive of the possibility that you know something they don’t. Perhaps here we can draw a sensible line between the terms self-awareness and consciousness, where the latter includes the ability to make complex models of the models contained in minds other than your own. I want to stress the word complex, as it doesn’t seem like gorillas have no theory of mind at all, just some more primitive version. It seems they model other minds as versions of their own mind in different states, perhaps aided by mirror neurons. Actually, upon reflection, it’s not so clear humans do it all that differently, seeing how prone we are to anthropomorphism. You know what I’m talking about if you gained new insights from “Three Worlds Collide”: it seems hard to conceive of nonhuman minds, and sometimes you end up with real nonsense like King Kong falling in love with a tiny female human because she has the “universally recognized property” called “beautiful”. Also, I sometimes catch myself implicitly modeling other human minds as “like me, except for x, y, and z”.
So maybe the reason gorillas don’t ask questions isn’t really that they lack a theory of mind, but that their theory of mind doesn’t include a model of reality for the particular mind they are modeling. They seem quite capable when it comes to modeling the emotional states and needs of other minds, but they seem to lack the insight that those minds also contain different perspectives on reality. Maybe that is what the term consciousness should describe: being able to create a model of a mind other than your own, including that mind having a different model of reality than your own. Yeah, I think this is it...
This seems to me like a genuinely more useful definition of consciousness, because it includes distinguishing features of minds that you could actually test, with meaningful results as outcomes. At some point children start to riddle you with questions, but for gorillas capable of sign language, that point just doesn’t seem to arrive. The kinds of “questions” they ask are more along the lines of “Can I get X”, or rather “I want you to give me permission to do X”.
Naturally, not everyone will be happy with that definition, because they really, really want to be able to say “my dog was unconscious when we visited the vet, but then it regained consciousness when it woke up”, but I submit that usefulness should trump habits of speech. Also, I can totally conceive of other minds putting forth even more detailed and useful definitions of what the term consciousness should describe, so define away.
Wow, thanks for your comments! I agree that this seems like a way forward in trying to see if the idea of consciousness is worth salvaging (the way being to look for useful features).
I’m starting to think that the concept of consciousness lives or dies by the validity of the concepts of ‘qualia’ and ‘sense of self’, both of which I already view with some suspicion. It looks possible to me that ‘sense of self’ is pretty much a confused way of referring to a thing being good at leveraging its control over itself to effect changes, plus some epiphenomenal leftovers (possibly qualia). It looks like maybe this is similar to what you’re getting at about self-modelling.