I like this post; I also doubt there is much coherence, let alone usefulness, to be found in most of the currently prevailing concepts of what consciousness is.
I prefer to think of words and their definitions as micro-models of reality that can be evaluated in terms of their usefulness, especially in building more complex models capable of making predictions. As in your excellent example of gender, words and definitions basically carve complex features of reality into manageable chunks at the cost of losing information—there is a trade-off, and getting it right enhances the usefulness of words and the concepts behind them. In 99.9% of cases the concept of biological gender is perfectly applicable to everyday life and a totally “good enough” model of reality, as long as you have the insight that hermaphrodites are also a real thing. In a case where you have to deal with one, the correct reaction is to adopt a more complex model of reality instead of trying to fit a complex reality into a model that is designed to compress information into categories with some inevitable information loss. Biological gender is a really good high-level model of reality because it draws its imaginary line in an area where very few exceptions actually exist. It’s an especially sharp distinction if you think of biological gender as having testicular/ovarian tissue, but in very rare instances this model will still fail to encompass special cases where the complexity of reality defies it. “Mental gender” seems to be a rather different yet in most cases more useful everyday concept of gender, because we usually care more about creating models of other people’s minds than about whether or not they have testicular or ovarian tissue—outside of curiosity or a medical context. The lesson here is that your model of reality will always fall short no matter where you draw the line (at least at “higher levels” of reality; “low-level” models of atoms or particles are much more precise and unambiguous than models of “higher-level” things like “persons” (when exactly does a fetus become a person?) or “societies” (are two people a society? How about eight?), and I’m quite sure any model of whatever “consciousness” is faces the same problem).
In other words, I think it’s better to think of models/maps in terms of their usefulness, not in terms of right or wrong. In my opinion the job of a model is to make something understandable and more predictable; the job of a model is not to “reflect reality as closely as possible”, especially for complex higher-level things. The “perfect model” in the latter sense would essentially be a perfect carbon copy of the real thing, and would tell you exactly as much, or as little, as the thing you’re trying to model already does anyway. The usefulness of a model lies in compacting information enough to become understandable while also predicting outcomes better than competing models.
If you accept that notion, the question really becomes: how should the term consciousness be defined to be useful and to describe or differentiate something we actually care about? We’d like to represent a part of reality we care about in a way that compresses information while retaining a high level of usefulness—meaning we can understand it without cutting away vital parts, and ideally we can make predictions if we integrate the concept of consciousness into models with the potential to predict outcomes.
So which part of reality should the term consciousness try to model in order to be useful? I find it highly problematic and close to maximally useless to think of consciousness as some kind of continuum on which we rank information processing in living things/agents. Some people really do think of consciousness this way: rocks have 0, bacteria perhaps 0.001, bees 0.01, rats 0.1, dogs maybe 0.2, humans perhaps 0.5. Maximally useless, I would argue, because it tells you nothing. Why not just substitute consciousness with some notion of “maximum calculations per second” and reserve the term consciousness for something we actually care about, instead of wasting such a nice word on something we don’t really care much about—and more importantly, on something we can already express with other words and concepts like “information processing”?
What’s funny about consciousness is that no one really agrees on what exactly the definition should be, but somehow everyone agrees that it’s really, really important. Why do we care so much about something we seemingly know close to nothing about? Seriously though, why do we?
Look at all those hilarious “quantum consciousness” or “become more conscious” concepts peddled by the self-help industrial complex. Possessing consciousness seems really high status nowadays, unlike, say… all those lousy low-status life forms like frogs and bees and mice. The idea that you can somehow improve your consciousness seems very appealing, because if insects and birds have little or none of that thing called consciousness, and people surely have some of it, then logically if I can get more of that awesome “consciousness” than my neighbor, I’m superior to him in just the same way I’m superior to a frog. Really, self-help opened my eyes to how unconsciously I once lived my life, and nowadays I feel strongly about helping all those low-consciousness people realize their full potential, and I do my best to help them become more conscious beings...
It sounds ridiculous, but couldn’t that be part of why consciousness is so damn important to us even if we have no clue what exactly it is? I may have no idea what consciousness is, but somehow I really insist that I have it; I mean, if everyone else says they have it, I surely have it too, can’t be left out. Whatever consciousness is, we usually agree that bacteria don’t have it and we do, so it must be important if some kinds of life have it and others don’t.
Okay, let’s get serious again. What distinct features of minds do we actually explicitly (and perhaps implicitly) care about when we employ that murky concept of consciousness? Hmm… well, if we care about it, we might gain insight into what exactly we care about by thinking about which specific situations make us choose to employ that word, and maybe from there we can distill why we seem to care so much.
Well, whatever consciousness means, most people agree the concept of awareness seems highly related or somehow relevant to it. Consciousness is often used as a synonym for self-awareness, but what on earth is that, exactly? (And why would we ever need two words for the exact same thing?) For some people it means having “internal experiences”, for others “being aware of having internal experiences”, which don’t quite seem to be the same thing from where I stand… but where do these intuitions about something I seemingly know nothing about come from? Probably personal experiences...
Sometimes I read a paragraph and my mind starts wandering and daydreaming until I snap out of it and think to myself, “Jesus, I was totally gone for a second; where was I again?” I realize my eyes are already at the bottom of the paragraph, and I seem to semi-remember that they kept wandering over the letters and words as if I was actually reading them… without being aware. Moreover, my mind’s reawakening seems to have been triggered by arriving at the end of that paragraph and going “now what?”, seemingly out of habit, because I usually stop at the end of a paragraph and consider whether I actually “got” what I read there. And sure enough, upon rereading the paragraph, it seems very familiar to me… but I was not at all sure whether I had read it just a few seconds ago, and I’d say whatever the word “self-aware” means shouldn’t really include that experience (or non-experience?) I just described. But did I lack “awareness” or just “self-awareness” in that example? Hmm...
If I drive a car (especially on familiar routes) my “auto-pilot” sometimes takes over. I stop at a red light, but my mind is primarily focused on visually modeling the buttocks of my girlfriend in various undergarments or none at all. Am I actually “aware” of having stopped at the red light? Probably I was as much “aware” of the red light as a cheetah is aware of eating the carcass of a gazelle. Interestingly, my mind seems capable of visually modeling buttocks in my mind’s eye while also reading real visual cues like red lights and habitually reacting to them—all at the same time. It seems I was more aware of my internal visual modeling than of the external visual cue, however. In a sense I was aware of both, yet I’m not sure I was “self-aware” at any point, because whatever that means, I feel like being self-aware in that situation would actually result in me going “Jesus, I should pay more attention to driving; I can still enjoy those buttocks in real life once I’ve actually managed to arrive home unharmed”.
So what’s self-awareness then? I suppose I use that term to mean something roughly like: “thoughts that include a model of myself while modeling a part of reality on the fly based on current sensory input”. If my mind is predominantly preoccupied with “daydreaming”, i.e. creating and playing with a visual or sensory model based on manipulating memories rather than real sensory inputs, I don’t feel like the term “self-awareness” should apply, even if that daydreaming encompasses a mental model of myself slapping a booty or whatever.
That’s surely still quite ill-defined and far from maximally useful, but whenever I’m tempted to use the word self-aware I seem to roughly think of something like that definition. So if we were to use “consciousness” as a synonym for self-awareness (which I’m not a fan of, but quite a few people seem to be), maybe my attempted definition is a start toward something more useful, as it includes at least some of the “mental features” we seem to care about, like “a model of oneself” and “interpreting sensory input to create a model of reality”.
The problem is that rats can construct models of reality as well, and these models outlast sensory inputs, which is pretty clear from experiments that put rats in mazes. The rats are stuck for some time in a maze without any exit or rewards present, but during that time they learn the layout of the maze even though it’s empty and they are not externally rewarded for doing so. Once you drop a treat into the maze, the rats that were able to wander around it beforehand know exactly how to get there as fast as possible, while rats new to that particular maze do not (latent learning, a classic finding of the “cognitive revolution” in psychology). Presumably their rat-mind also features some kind of model of themselves, though presumably one that mainly features their body, not so much their mind.
So to make the concept of self-awareness, and perhaps consciousness, more useful, maybe what we really care about in the end is a mind being able to feature a model of its own mind (and thus of what we call “ourselves”).
This is quite interesting… young children, and for example gorillas who were taught to communicate in sign language, seem to lack a fully developed “theory of mind”. Meaning it seems they can’t conceive of the possibility that other minds contain things theirs does not… well, kind of. If they do model other minds, they seem to model them a lot like copies of their own mind, or perhaps slightly altered copies. Gorillas that can communicate in sign language are perfectly capable of answering questions about, e.g., their mood… implying self-awareness that goes somewhat beyond just recognizing their physical reflection in a mirror; they are also aware of their own feelings, i.e. internal experiences. But they never seem to get the brilliant idea of asking you a question, presumably because they can’t conceive of the possibility that you know something they don’t. Perhaps here we can draw a sensible line that differentiates between the terms self-awareness and consciousness, where the latter includes the ability to make complex models of the models contained in minds other than your own. I want to stress the word complex, as it doesn’t seem like gorillas have no theory of mind at all, just a more primitive version. It seems they model other minds as versions of their own minds in different states, aided by mirror neurons. Actually, upon reflection, it’s not so clear humans do it all that differently, seeing how prone we are to anthropomorphism. You know what I’m talking about if you gained new insights from “Three Worlds Collide”—it seems hard to conceive of nonhuman minds, and sometimes you end up with real nonsense like King Kong falling in love with a tiny female human because she has the “universally recognized property” called “beautiful”. Also, I sometimes catch myself implicitly modeling other human minds in terms of “like me except for x, y, and z”.
So maybe the reason gorillas don’t ask questions isn’t really that they lack a theory of mind, but that their theory of mind does not include the model of reality held by the particular mind they try to model. They seem quite capable when it comes to modeling the emotional states and needs of other minds, but they just seem to lack the insight that those minds also contain different perspectives on reality. Maybe that is what the term consciousness should describe… being able to create a model of a mind other than your own, including that mind having a different model of reality than your own. Yeah, I think this is it...
This seems to me like a genuinely more useful definition of what consciousness is, because it includes distinguishing features of minds you could actually test, with meaningful results as outcomes. At some point children start to riddle you with questions, but for gorillas capable of sign language that point just doesn’t seem to arrive. The kinds of “questions” they ask are more along the lines of “Can I get X”, or rather “I want you to give me permission to do X”.
Naturally not everyone will be happy with that definition, because they really, really want to be able to say “my dog was unconscious when we visited the vet, but then it regained consciousness when it woke up”, but I submit usefulness should trump habits of speech. Also, I can totally conceive of other minds putting forth even more detailed and useful definitions of what the term consciousness should describe, so define away.
Wow, thanks for your comments! I agree that this seems like a way forward in trying to see if the idea of consciousness is worth salvaging (the way being to look for useful features).
I’m starting to think that the concept of consciousness lives or dies by the validity of the concepts of ‘qualia’ or ‘sense of self’, both of which I already view with some suspicion. It looks possible to me that ‘sense of self’ is pretty much a confused way of referring to a thing being good at leveraging its control over itself to effect changes, plus some epiphenomenal leftovers (possibly qualia). It looks like maybe this is similar to what you’re getting at about self-modelling.