My own presumption regarding sentience and intelligence is that it’s possible to have one without the other (I don’t think they are unrelated, but I think it’s possible for systems to be extremely capable but still not sentient).
I think it can be easy to underestimate how different other possible minds may be from ourselves (and other animals). We have evolved a survival instinct and an instinct not to want to be dominated, but I don’t think any intelligent mind would need to have those instincts.
To me it seems that thinking machines don’t need feelings in order to be able to think (similarly to how it’s possible for minds to be able to hear but not see, and vice versa). Some things relating to intelligence are of such a kind that you can’t have one without the other, but I don’t think that is the case for the kinds of feelings/instincts/inclinations you mention.
That being said, I do believe in instrumental convergence.
Below are some posts you may or may not find interesting :)
Ghosts in the Machine
The Design Space of Minds-In-General
Humans in Funny Suits
Mind Projection Fallacy