a computer program could make those judgements (sic) without actually experiencing any of those qualia
Just as an FYI, this is the place where your intuition is blindsiding you. Intuitively, you “know” that a computer isn’t experiencing anything… and that’s what your entire argument rests on.
However, this “knowing” is just an assumption, and it’s assuming the very thing that is the question: does it make sense to speak of a computer experiencing something?
And there is no reason, apart from that intuition/assumption, to treat this as a different question from "does it make sense to speak of a brain experiencing something?".
IOW, substitute “brain” for every use of “computer” or “simulation”, and make the same assertions. “The brain is just calculating what feelings and qualia it should have, not really experiencing them. After all, it is just a physical system of chemicals and electrical impulses. Clearly, it is foolish to think that it could thereby experience anything.”
By making brains special, you’re privileging the qualia hypothesis based on an intuitive assumption.
I don’t think you read my post very carefully. I didn’t claim that qualia are a phenomenon unique to human brains. I claimed that human-like qualia are a phenomenon unique to human brains. Computers might very well experience qualia; so might a lump of coal. But if you think a computer simulation of a human experiences the same qualia as a human, while a lump of coal experiences no qualia or different ones, you need to make that case to me.
But if you think a computer simulation of a human experiences the same qualia as a human, while a lump of coal experiences no qualia or different ones, you need to make that case to me.
Actually, I’d say you need to make a case for WTF “qualia” means in the first place. As far as I’ve ever seen, it seems to be one of those words that people use as a handwavy thing to prove the specialness of humans. When we know what “human qualia” reduce to, specifically, then we’ll be able to simulate them.
That’s actually a pretty good operational definition of “reduce”. ;-) (Not to mention “know”.)