To think that the book has sentience sounds to me like a statement of magical thinking, not of physicalism.
I’m pretty sure this is because you’re defining “sentience” as some extra-physical property possessed by the algorithm, something which physicalism explicitly rejects.
Consciousness isn’t something that arises when algorithms compute complex social games. Consciousness simply is the computation of complex social games by some algorithm (under a purely physical theory of consciousness such as EY’s).
To understand how physicalism can talk about metaphysical categories, consider numbers. Some physical systems have the property of being “two of something” as understood by human beings. Two sheep standing in a field, for example. Or two rocks piled on top of one another. There’s no magical thing that happens when “two” of something come into existence. They don’t suddenly send a glimmer of two-ness off into a pure platonic realm of numbers. They simply are “two”, and what makes them “two” is that being “two of something” is a category readily recognized by human beings (and presumably other intelligent beings).
Similarly, a physicalist theory of consciousness defines certain physical systems as conscious if they meet certain criteria. Specifically for EY, these criteria are self-recognition and complex social games. It doesn’t matter whether they are implemented by a Chinese room, a computer, or a bunch of meat. What matters is that they implement a particular algorithm.
When confronted with the Chinese-room consciousness, EY might say something like: “I recognize that this system is capable of self-reflection and social reasoning in much the same way that I am, and therefore I recognize that it is conscious in much the same way that I am.”