Well, when you put it that way, I guess I consider it a respectable argument, myself.
That is, it’s a useful exercise for starting to think rigorously about what it means to be a mind. That’s what thought experiments are for, after all, to make you think about things you might not have thought about otherwise. That function deserves respect.
If you decide the Chinese Box really does understand Chinese, that implies certain things about the nature of understanding. If you decide the Chinese Box simply can’t exist at all, that implies other things. If you decide it could understand Chinese if only X or Y, ditto. If you decide that neither the Chinese Box nor any other system is actually capable of understanding Chinese, ibid.
But Searle really does seem to believe that it provides a reason to conclude one way over another, and that seems downright bizarre to me.
Unfortunately, yes. (Or if not compelling, at least respectable.)