Tobis: That which makes you suspect that bricks don’t have qualia is probably the objective test you’re looking for.
Eliezer had a post titled “How An Algorithm Feels From Inside”: http://lesswrong.com/lw/no/how_an_algorithm_feels_from_inside/
Its subject was different, but in my opinion, that's what qualia are: what it feels like from the inside to see red. You cannot describe it because “red” is the most fundamental category the brain perceives directly; the brain hands you the category without telling you what it means. With a different mind design, you might have had qualia for frequency instead. Then that would feel like something fundamental, something that could never be explained to a machine.
But the fact is that if you tell the machine under what circumstances you say you see red, that is all the information it needs to serve you or even impersonate you. It doesn't NEED anything else; it hasn't lost anything of value. Which is, of course, what the Turing Test is all about.
Come to think of it, with this definition it might even be possible (albeit pointless) to create a robot that has exactly a human's qualia. Just make it place colors into discrete buckets, and then fail to connect those buckets with its knowledge of the electromagnetic spectrum.
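Here's a toy Python sketch of what I mean (the class name, the bucket boundaries, everything in it is made up for illustration): the robot sorts wavelengths into discrete, opaque labels, and its “explanation” of a label is a dead end, disconnected from its spectral knowledge.

```python
# Toy sketch only: names and wavelength ranges are illustrative.
BUCKETS = {
    "red":   (620, 750),  # approximate wavelength range in nm
    "green": (495, 570),
    "blue":  (450, 495),
}

class BucketQualia:
    """Perceives colors as opaque, fundamental categories."""

    def perceive(self, wavelength_nm):
        # Sort the stimulus into a discrete bucket...
        for name, (lo, hi) in BUCKETS.items():
            if lo <= wavelength_nm < hi:
                return name
        return "something-else"

    def explain(self, name):
        # ...but never connect the bucket back to the robot's
        # knowledge of the electromagnetic spectrum.
        return f"'{name}' is just what it looks like. I can't break it down."

robot = BucketQualia()
print(robot.perceive(680))   # -> red
print(robot.explain("red"))  # -> an irreducible non-answer
```

The robot reports “red” under exactly the circumstances you do, which is all that impersonation requires; asking it what red is like gets you the same non-answer a human gives.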
Also, what I meant by “hubgalopus” is not that subjective experience is one. I meant that when you find yourself unable to decide whether an object has a trait, it's probably because you have no inkling what the hell you're looking for. Is it a dot? Or is it a speck? When it's underwater, does it get wet? Choose a frickin' definition, and then “does it exist?” becomes a simple boolean-valued question.
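In code terms (another toy sketch; the cutoffs here are arbitrary ones I picked, which is exactly the point): commit to a definition first, and the trait question collapses into a boolean.

```python
# Toy sketch: the definitions are arbitrary, which is the point.
def is_dot(diameter_mm: float) -> bool:
    # Definition chosen up front: a mark under 0.5 mm is a "dot".
    return diameter_mm < 0.5

def is_wet(has_liquid_on_surface: bool) -> bool:
    # Definition chosen up front: "wet" means liquid on the surface,
    # so a submerged object is trivially wet. A different definition
    # gives a different, but equally boolean, answer.
    return has_liquid_on_surface

print(is_dot(0.3))   # True, by this definition
print(is_wet(True))  # True, by this definition
```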