The Chinese room argument is wrong because it fails to account for emergence. A system can possess properties that its components don’t; for example, my brain is made of neurons that don’t understand English, but that doesn’t mean my brain as a whole doesn’t. The same argument could be applied to the Chinese room.
The broader failure is assuming that properties which hold at one level of abstraction must hold at another.
But a computational system can’t be mysteriously emergent. Your response is equivalent to saying that semantics is constructed, reductionistically, out of syntax. How?