It’s true. The general definition of sentience, once it gets beyond just having senses and responding to stimuli, tends to consider qualia.
I do think it’s worth noting that even if you went so far as to say “I and everyone must care about them as people,” the moral implications aren’t exactly straightforward. They need input to exist as dynamic entities. They aren’t person-shaped. They might not have desires, or their desires might be purely prediction-oriented. Or maybe we don’t actually care about the thinking panpsychic landscape of the AI itself, just the person-shaped things it conjures to interact with us, which have numerous conflicting desires and questionable degrees of ‘actual’ existence. If you’re fighting ‘for’ them in some sense, what are you fighting for, and does it actually ‘help’ the entity, or does it just move them toward your own preferences?