Z.M. Davis, “I am consciously aware that 2 and 2 make 4” is not a different claim from “I am aware that 2 and 2 make 4.” One can’t make one claim without making the other. In other words, “I am unconsciously aware that 2 and 2 make 4” is a contradiction in terms.
If an AI were unconscious, it would presumably be a follower of Daniel Dennett; i.e., it would admit that it had no qualia, but would say that the same is true of human beings. But then it would say that it is conscious in the same sense that human beings are. Likewise, if it were conscious, it would say it was conscious. So it would say it was conscious whether it was or not.
I agree in principle that there could be an unconscious chatbot that could pass the Turing test; but it wouldn’t be superintelligent.
If it says that I have no qualia, it's wrong. Headaches fucking HURT, dammit. That's a quale.
And here's the point you seem to be missing: yes, it can make the statement; but no, that does not mean it actually has the capacities required to make the statement true. It's trivially easy to write a computer program that prints out all manner of statements.