I think I’m now (leaning towards being) a ‘panpsychist’, but in terms of equating information processing with sentience, not ‘consciousness’.
‘Consciousness’ is ‘being a storyteller’.
A (VERY rough) measure of ‘sentience’ tho is ‘what kind of stories can we tell about what it’s like to be a particular thing’. The ‘story’ of a photon, or even a rock, is going to be much simpler than the same thing for a bacterium or a human. (It’s possible tho that we might ‘miss’ some, or even most, ‘sentience’ because it’s so alien to our own experiences and understanding.)
I don’t think ‘consciousness’ can exist without ‘temporal memory’ and ‘language’ (i.e. an ability to communicate, even if only with oneself).
So, along these lines, non-human primates/apes probably are somewhat conscious. They can tell ‘lies’, for one. Evidence for their consciousness being more limited than our own is that the ‘stories they tell’ are much simpler than ours.
But I think one nice feature of these ideas is that we could in fact discern some facts about this for particular entities (via ‘external observation’), e.g. test whether they have ‘temporal memory’ or language (both evidence of ‘consciousness’), or whether they have ‘experiences’ and respond to features of their environment (which would be evidence of ‘sentience’).
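To make that a little more concrete, here’s a toy sketch (in Python) of what an ‘external observation’ test for ‘temporal memory’ might look like. The paradigm is ‘delayed match-to-sample’, which is a standard one in animal cognition research; the `Agent` interface itself is entirely made up for illustration, not anything specific I have in mind.

```python
# A toy sketch of testing for 'temporal memory' via external observation,
# using the delayed match-to-sample paradigm. The Agent interface here is
# hypothetical -- just enough to make the idea concrete.
import random

def delayed_match_to_sample(agent, trials=100, delay_steps=10):
    """Return the agent's accuracy at recalling a sample after a delay.
    Accuracy well above chance (0.5) is evidence of temporal memory."""
    correct = 0
    for _ in range(trials):
        sample = random.choice(['A', 'B'])
        agent.observe(sample)              # show the sample stimulus
        for _ in range(delay_steps):
            agent.observe(None)            # blank delay period
        choice = agent.choose(['A', 'B'])  # forced choice between options
        correct += (choice == sample)
    return correct / trials

class PerfectMemoryAgent:
    """A trivial agent with one slot of memory, just to exercise the test."""
    def observe(self, stimulus):
        if stimulus is not None:
            self.last = stimulus
    def choose(self, options):
        return self.last

print(delayed_match_to_sample(PerfectMemoryAgent()))  # -> 1.0
```

The nice thing about a test like this is that it says nothing about what’s going on ‘inside’; it only needs the entity to behave, which is exactly the ‘external observation’ constraint.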
I’m with you on information being key to all of this.
We’re using words differently. When I say “consciousness”, I mean “there being something which it is like to be a thing”, which I think you mean by “sentience.” What I would call the thing you call “consciousness” is either “sapience” or “sophonce”, depending on whether you consider self-awareness and agency an important part of the definition (sophonce) or not (sapience). The difference is that I expect tool / non-agentive AGIs to be sapient (in that they can think abstractly), but not sophont (they are not “people” who have an identity and will of their own).
“There being something which it is like to be this thing” is a characteristic I consider likely to be possessed by all systems which are “processing information” in some way, though I am unsure exactly what that means. It certainly seems to be something all living things possess, and possibly some nonliving ones—for instance, it has recently been demonstrated that neural networks can be simulated using the acoustics of a vibrating metal sheet (I read about it in Quanta, I’ll give you a link if you want but it shouldn’t be hard to find), meaning that for the duration that they are being used this way, they (or rather, the algorithm they are running) are as conscious as the network being simulated would be.
I think that photons do not have this characteristic—they are not sentient in your terms—because they aren’t interacting with anything or doing anything as they float through the aether. Possibly sparks of qualia exist when they interact with matter? But I don’t know the physical details of such interactions or whether it is possible for a photon to remain “the same photon” after interacting with other matter—I expect that it is a meaningless concept—so there would be no continuity of consciousness / sentience whatsoever there—nothing remotely resembling a being that is having the experience.
A rock on the other hand maybe would have a rather continuous experience, albeit unimaginable to us and extremely simple, due to the coherence of its physical form and the fact that acoustic vibrations (which maybe are a kind of computation or information processing, in a sense?) are traversing it. But if information processing means something more than that, then rocks wouldn’t be doing it either. I’m really not sure about all that.
Yeah, my current personal definitions/models of ‘consciousness’ and ‘sentience’ might be very idiosyncratic. They’re in flux as of late!
I think photons are more complicated than you think! But I also don’t think their ‘sentience’ is meaningfully different from a rock’s. A rock is much bigger tho, and you could tell ‘stories’ of all the parts of which it’s made, and I would expect those to be similar to the ‘stories’ you could tell about a photon; maybe a little more complicated. But they still feel like they have the same ‘aleph number’ in ‘story terms’ somehow.
I think what separates rocks from even viruses, let alone bacteria, is that the ‘stories’ you could tell about a rock are necessarily and qualitatively simpler, i.e. composed of purely ‘local’ stories for almost all of their parts. The rock itself tho is, kinda, in some sense, a memory of its history, e.g. of ‘geological weathering’.
It’s hard to know, in principle, whether we might be missing ‘stories’, or sentience/consciousness/qualia, because those stories are much slower than ours (or maybe much faster too).
Viruses, bacteria, and even probably the simplest possible ‘life prototypes’ all seem like they’re, fundamentally, ‘just’ a kind of memory, i.e. history.
‘Rocks’ have stories composed of stasis, simple fixed reactions, and maybe sometimes kinds of ‘nested explosions’ (e.g. being broken apart into many pieces).
‘Life’ has a history – it IS a history, copying itself into the future. It’s on ‘another level’ (‘of ontology’? of ‘our ontology’?) because it’s the kind of thing capable of ‘universal computation’ based on molecular dynamics.
We’re a kind of life capable of cognitive (and cultural) ‘universal computation’ – there might be a LOT of this kind of life beyond our own species; maybe most things with something like a brain or even just a nervous system? Humans do seem ‘programmable’ in a way that almost everything else seems ‘hard-coded’. Some other animals do seem to have some limited programmability; maybe not ‘universal programmability’ tho?
I think we (humans) are, or many of us anyways, ‘universally programmable’ via, e.g. culture, or (‘conscious’) ‘reasoning’.
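Since ‘universal computation’ is doing a lot of work in the last few paragraphs, here’s a minimal concrete instance of it arising from trivially simple, purely ‘local’ dynamics: Rule 110, a one-dimensional cellular automaton that has been proven Turing-complete (Cook 2004). The grid width and step count are arbitrary choices for the sketch.

```python
# A minimal sketch of 'universal computation' arising from trivial local
# dynamics: the Rule 110 cellular automaton, which is Turing-complete
# (Cook 2004). Each cell's next state depends only on itself and its two
# neighbors -- a purely 'local' story, yet universal in aggregate.

RULE_110 = {
    (1, 1, 1): 0, (1, 1, 0): 1, (1, 0, 1): 1, (1, 0, 0): 0,
    (0, 1, 1): 1, (0, 1, 0): 1, (0, 0, 1): 1, (0, 0, 0): 0,
}

def step(cells):
    """Apply the Rule 110 update to every cell (wrapping at the edges)."""
    n = len(cells)
    return [RULE_110[(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])]
            for i in range(n)]

cells = [0] * 63 + [1]  # a single 'on' cell as the initial condition
for _ in range(30):
    print(''.join('#' if c else '.' for c in cells))
    cells = step(cells)
```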
A definition of ‘sentient’:
able to perceive or feel things.
‘Perceive’ seems relatively ‘sharp’ for us, and for this site generally. (Do you disagree?)
‘Feel’ I think could be helpfully sharpened to ‘can react to specific things’. I think it’s sensible to think that we can ‘feel’ things that we’re not consciously aware of. (We probably can’t consciously report those feelings tho! Or not easily.)
Tho maybe ‘feel’ reasonably requires ‘consciousness’, i.e. the ability to tell a story about feelings, i.e. tell a story about our internal subjective experience.
I think it makes sense for us to tell a story about, e.g., a lion being ‘scared’ or ‘afraid’ while being attacked by a pack of wild dogs. I’m less sure that it makes sense to think that the lion itself ‘feels’ those emotions? What do we, what could we, mean by that?
I don’t think there’s any evidence that lions can themselves ‘tell a story’ that mentions that fear. I wouldn’t guess that there’s anything going on in the lion’s brain/mind that’s at all like telling itself “Oh shit oh shit oh shit”. I don’t think there’s anything remembering the fear, and its context, or any of the relevant associations. I don’t think it’s just that we can’t know whether the lion ‘feels’, from ‘the outside’, because the lion can’t communicate with us. I don’t think the lion can communicate the fact of its specific fear to other lions either. Nor do I think it can ‘tell itself’ of that specific fear.
It ‘feels’ the fear, but afterwards that fear is ‘gone’. (The effects of that fear can of course persist, even in its brain/mind. But there’s no ‘pointer’ to or memory of that specific fear anywhere in particular.)
I think consciousness is basically ‘being able to tell a story’ and it’s just a kind of ‘convergent inevitability’ that we should generally expect consciousness to also entail/imply ‘being able to tell a story about itself’, and thus also ‘know of itself as a thing in the World’.
Inspiration:
Stephen Wolfram: Complexity and the Fabric of Reality | Lex Fridman Podcast #234 (YouTube)