Consciousness, whatever it may be—a substance, a process, a name for a confusion—is not epiphenomenal; your mind can catch the inner listener in the act of listening, and say so out loud. The fact that I have typed this paragraph would at least seem to refute the idea that consciousness has no experimentally detectable consequences.
Eliezer, I’m shocked to see you write such nonsense. This only shows that you don’t understand the zombie hypothesis at all. Or perhaps you suppose that intelligence requires consciousness. This is the spiritualist, Searlian stuff you usually oppose.
The zombie hypothesis begins by asserting that I have no way of knowing whether you are conscious, no matter what you write. I would expect you, of all people, to accept this, since you believe that you are Turing-computable. You haven’t made an argument against the zombie hypothesis; you’ve merely asserted that it is false and called that assertion an argument.
The only thing I can imagine is that you have flipped the spiritualist argument around to its mirror image. Instead of saying that “I am conscious; Turing machines may not be conscious; therefore I am not just a Turing machine”, you may be saying, “I am conscious; I am a Turing machine; therefore, all Turing machines that emit this sequence of symbols are conscious.”