One argument I’ve encountered is that sentient creatures are precisely those creatures that we can form cooperative agreements with. (Counter-argument: one might think that e.g. the relationship with a pet is also a cooperative one [perhaps more obviously if you train them to do something important, and you feed them], while also thinking that pets aren’t sentient.)
Another is that some people’s approach to the Prisoner’s Dilemma is to decide “Anyone who’s sufficiently similar to me can be expected to make the same choice as me, and it’s best for all of us if we cooperate, so I’ll cooperate when encountering them”; and some of them may figure that sentience alone is sufficient similarity.
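That decision rule ("cooperate with anyone sufficiently similar to me, since they'll make the same choice I do") can be sketched concretely. A minimal illustration, with all names and the similarity threshold purely hypothetical, assuming standard one-shot Prisoner's Dilemma payoffs:

```python
from dataclasses import dataclass

# Standard PD payoffs as (my_payoff, their_payoff); "C" = cooperate, "D" = defect.
PAYOFFS = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

@dataclass
class Agent:
    # Stand-in for however similarity-to-me is actually judged;
    # the argument in the text would set this high for any sentient agent.
    similarity_to_me: float

def choose(other: Agent, threshold: float = 0.8) -> str:
    # If the other agent is similar enough, expect it to mirror my choice,
    # so mutual cooperation (3, 3) beats mutual defection (1, 1).
    return "C" if other.similarity_to_me >= threshold else "D"

print(choose(Agent(similarity_to_me=0.95)))  # cooperates with a near-twin
print(choose(Agent(similarity_to_me=0.2)))   # defects against a dissimilar agent
```

The interesting move in the text is what feeds `similarity_to_me`: the view described there would treat bare sentience as enough to clear the threshold.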
We need better, more specific terms to break up the horrible mishmash of correlated-but-not-truly-identical ideas bound up in words like “conscious” and “sentient”.
At the very least, let’s distinguish sentient vs sapient. All animals are sentient; only smart ones are sapient (maybe only humans, depending on how strict your definition is).
Some other terms needing disambiguation…
Current LLMs are very knowledgeable, somewhat but not very intelligent, somewhat creative, and lacking in coherence. They have some self-awareness, but it seems to lack aspects that animals have around “feeling self state” — though some researchers are working on adding these aspects to experimental architectures.
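One way to make the “break up the mishmash” proposal concrete is to score entities on separate axes rather than a single consciousness/sentience scale. A purely illustrative sketch — the axis names and every rating are hypothetical, not measurements:

```python
# Hypothetical axes replacing the single blurry "consciousness" score.
AXES = ["knowledge", "intelligence", "creativity", "coherence", "self_awareness"]

# Rough 0-to-1 ratings, invented solely to illustrate the multi-axis idea.
profiles = {
    "current_llm": {"knowledge": 0.9, "intelligence": 0.4, "creativity": 0.5,
                    "coherence": 0.2, "self_awareness": 0.2},
    "dog":         {"knowledge": 0.1, "intelligence": 0.3, "creativity": 0.2,
                    "coherence": 0.7, "self_awareness": 0.6},
}

def describe(name: str) -> str:
    # Summarize one entity's profile axis by axis, in a fixed order.
    p = profiles[name]
    return ", ".join(f"{axis}={p[axis]:.1f}" for axis in AXES)

print(describe("current_llm"))
print(describe("dog"))
```

The point is structural: an LLM and a dog can each score high on axes where the other scores low, which a single scalar notion of “sentience” cannot express.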
What a mess our words based on observing ourselves make of trying to divide reality at the joints when we try to analyze non-human entities like animals and AI!
https://english.stackexchange.com/questions/594810/is-there-a-word-meaning-both-sentient-and-sapient