In order to claim that qualia can’t be duplicated in a qualia simulator, you are claiming that a purely mental property exists, outside physical reality.
Um, no. The whole point of a qualia simulator is that it’s running on a physical substrate which may or may not instantiate qualia itself. This is a property of physical reality, not one of mind or algorithms.
The untrained mind takes this to mean that there must be no qualia in the machine...
Not quite. I may have no direct evidence that the machine has any subjective experience or qualia, but then again, the only subjective experience I have any evidence of is my own. Nevertheless, I can reasonably suppose that subjective experience also applies to other people, because I share a common body structure and biological ancestry with them, and as a physicalist I believe that subjective experience depends on some kind of physical substrate. But as far as behavior goes, I would regard the output of a qualia simulator (say, a whole-brain emulation) as being indistinguishable from any other mind.[1]
that’s how the algorithm feels from the inside
It’s not obvious that an algorithm should feel like anything, per se. Will the algorithm feel the same when run on an Intel Core computer or a SPARC workstation?
[1]And yes, the simulated minds could talk about their own subjective experiences. If these subjective experiences are instantiated somehow as part of their algorithm, we could then take that as their equivalent to our qualia. But these minds would not share our physical substrate, so the status of these “qualia” would be radically different from our own.
It’s not obvious that an algorithm should feel like anything, per se.
I’m using the phrase in this sense… that is, the distinctions that are available for an algorithm to make—the states that are reachable, in some sense.
Human brains have special states to represent conscious or “intentional” animate entities, so to us it “feels” as though this category is special.
But these minds would not share our physical substrate, so the status of these “qualia” would be radically different from our own.
And what’s your extraordinary evidence for that extraordinary claim? You’re basically claiming that there is something special about human brains that makes them different. Why is that?
How did you arrive at this hypothesis, as opposed to any other?
Obvious answer: you privileged this hypothesis, out of any number of equally complex hypotheses, because it has intuitive appeal. Your brain has a special category for this, so it feels sensible.
But it isn’t sensible as a hypothesis, because if you didn’t have that special category already built into your brain, you would have no reason to single out human brains as not only the only substance in all the world that currently has these miraculous qualia, but also the only substance that ever will have them.
However, if you set your intuition aside, then there is no other reason whatsoever to single out such an extraordinary hypothesis for special consideration.
But it isn’t sensible as a hypothesis, because if you didn’t have that special category already built into your brain, you would have no reason to single out human brains as not only the only substance in all the world that currently has these miraculous qualia, but also the only substance that ever will have them.
Obviously, regarding a system as complex as the human brain as “the only substance in all the world that currently has these miraculous qualia” is an unlikely hypothesis. Nevertheless, subjective experience could be instantiated in a simpler physical system as a result of brain activity. There is plenty of precedent for biological lifeforms tapping into “exotic” physics for some of their adaptive functions, and subjective experience might be no different.
And what’s your extraordinary evidence for that extraordinary claim?
It’s not an extraordinary claim. As a physicalist, if I’m going to take subjective experience seriously as anything other than what some algorithms (or minds) like to talk about, then it’s reasonable to suppose that the physical substrate matters.
As a physicalist, if I’m going to take subjective experience seriously as anything other than what some algorithms (or minds) like to talk about, then it’s reasonable to suppose that the physical substrate matters.
In a way that produces no distinguishable physical effect?
You seem to be hypothesizing the existence of an invisible dragon. Why? Explain to me how you came to select this hypothesis, out of all similarly complex possible hypotheses.
Like, for example, let’s say you grab a philosopher out of the past who insists that women don’t have men’s reasoning power because, you know, they’re not men, and that there surely must be some physical reason why this is so!
Wouldn’t you want to know why he hypothesizes this? What his evidence is? Why he insists that, even if a woman were—hypothetically speaking, in his view—to make the same statements or draw the same conclusions as a man, from the same inputs as a man… then somehow, she still wouldn’t “really” be reasoning like a man, because she has female “qualia” instead of male “qualia”?
Pretty soon, you’d have to come to the conclusion that he’s arguing from the bottom line: trying to provide argumentative support for a conclusion he already had before he started, rather than simply investigating what truth there was to be found.
(Especially if he has nothing in the way of physical evidence for the existence of these “qualia” things… but appears to have just seized on them as a way to justify the already-existing intuitions.)
Whatever. How about the arguments of qualiaphiles who don’t think simulations will necessarily lack qualia?