What exactly makes me, EY, and presumably many others think that sentience is a thing, and distinguish “sentient” from “non-sentient”?
Wait, is “sentient” actually a thing? I always thought that it was just a shorthand we use to describe a wide gamut of phenomena. Humans are quite sentient, chimps less so, dogs even less so, our current AIs even less sentient than that, and rocks aren’t sentient at all. Am I wrong about this?
That is what I am trying to discern: is “sentient” a computational property, or is it reducible to “why does my brain make me think it”?
I agree with your statement, but I fail to see how to distinguish a “sentient” super-intelligence from a “non-sentient” one.
In general I am confused.
Well, you could ask it whether it has subjective experience and trust its self-report. That’s basically the same strategy we use for other intelligences, after all.
And we return to the black box of subjective experience.
What do you mean by “black box”? If the AI (or alien or uplifted dolphin or whatever) tells me that it has subjective experiences, why shouldn’t I take it at its word?
Oh, I am not denying that they exist, just saying that I don’t know of a solid theory of subjective experience. I think there was something about how Bayesian {Predictive world model, planning engine, utility function, magic AI algorithm} AIs would not have philosophy.
Sorry, I have trouble parsing this sentence. But in general, I don’t think we need a detailed theory of subjective experiences (assuming that it even makes sense to conceive of such a theory) in order to determine whether some entity is sentient—as long as that entity is also sapient, and capable of communication. If that’s the case, then we can just ask it, and trust its word. If that’s not the case, then I agree, we have a problem.
I’m not entirely sure what “why does my brain make me think it” means, but I’ve just noticed that I incorrectly used the word “sentient” in its science-fictional sense; I should’ve said something like “sapient” instead. The word “sentient” is often incorrectly used (e.g. by me) to mean “capable of rational thought and communication”, whereas the more correct definition is “capable of having subjective experiences”.
As luck would have it, my previous comment applies to both meanings of the word, but still, they are distinct (though probably related). I apologize for the confusion.