“From the inside, the program experiences no mechanisms of reduction of these atomic qualia”
Materialism predicts that algorithms have an “inside”?
As a further note, I’ll have to say that if all the blue and all the red in my visual experience were switched around, my hunch tells me that I’d be experiencing something different; not just in the sense of different memory associations, but that the visual experience itself would be different. It would not just be that “red” is associated with hot and “blue” is associated with cold… the qualia of the visual experience itself would be different.
Materialism predicts that algorithms have an “inside”?
Yes. The scene from within a formal system (like algebra) has certain qualities (equations, variables, functions, etc.) that are different from the scene outside (markings on paper, the equals sign, BEDMAS, variable names, brackets for function application).
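To unpack that a little, here’s a minimal sketch in Python (the type names are mine, purely for illustration): the outside view is just a string of markings, while the inside view is the structure the system actually manipulates, with qualities like variables, constants, and function application that simply don’t exist at the level of markings.

```python
# A toy sketch (hypothetical names): one piece of algebra, two views.
from dataclasses import dataclass

# Outside view: markings on paper, nothing more.
outside = "2*x + 3"

# Inside view: the structure the formal system works with.
@dataclass
class Const:
    value: int

@dataclass
class Var:
    name: str

@dataclass
class Add:
    left: object
    right: object

@dataclass
class Mul:
    left: object
    right: object

inside = Add(Mul(Const(2), Var("x")), Const(3))  # 2*x + 3 as structure

def evaluate(expr, env):
    """Evaluate an expression tree given variable bindings."""
    if isinstance(expr, Const):
        return expr.value
    if isinstance(expr, Var):
        return env[expr.name]
    if isinstance(expr, Add):
        return evaluate(expr.left, env) + evaluate(expr.right, env)
    if isinstance(expr, Mul):
        return evaluate(expr.left, env) * evaluate(expr.right, env)

print(evaluate(inside, {"x": 5}))  # 13 -- a fact visible only from inside
print(len(outside))                # 7  -- a fact only about the markings
```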
That’s not really a materialism thing, it’s a math thing.
As a further note, I’ll have to say that if all the blue and all the red in my visual experience were switched around, my hunch tells me that I’d be experiencing something different; not just in the sense of different memory associations, but that the visual experience itself would be different. It would not just be that “red” is associated with hot and “blue” is associated with cold… the qualia of the visual experience itself would be different.
Hence the part where they are compared to other qualia. Maybe that’s not enough, but imagining getting “blue” or “sdfg66df” instead of “red” (which is the evidence you are using) is of course going to return “they are different”, because they don’t compare equal, even if the output of the computation ends up being the same.
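A minimal sketch of that point in Python (all names hypothetical): if “red” and “blue” are opaque tokens that are only ever compared against each other, swapping them consistently changes no comparison and no output, while directly asking “is this the token I imagined?” will of course answer “different”.

```python
# Hypothetical sketch: a computation that only uses equality of tokens.
def classify(percepts):
    """Count how often consecutive percepts match."""
    return sum(a == b for a, b in zip(percepts, percepts[1:]))

original = ["red", "red", "blue", "red"]
swap = {"red": "blue", "blue": "red"}       # could equally map to "sdfg66df"
inverted = [swap[p] for p in original]

print(classify(original) == classify(inverted))  # True: same output
print(original == inverted)                      # False: tokens don't compare equal
```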
That’s not really a materialism thing, it’s a math thing.
I’m under the impression that what you describe falls under computationalism, not materialism, but my reading on these ideas is shallow and I may be confusing some of these terms...
I must say I can’t tell the difference between materialism (“the mind is built of stuff”) and computationalism (“the mind is built of algorithms, running on stuff”). If I get them confused in some way, sorry.
That thought experiment doesn’t make much sense. If the experiences were somehow switched but everything else kept the same (i.e. all your memories and associations of red are still connected to each other and to everything else in the same way), you wouldn’t notice the difference; everything would still match your memories exactly. If there even is such a thing as raw qualia, there is no reason to suppose they are stable from one moment to the next; as long as the correct network of associations is triggered, there is no evolutionary advantage either way.
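For what it’s worth, here is a minimal sketch of that claim in Python (names hypothetical): if the raw labels are swapped but every memory and association is swapped along with them, every internal check returns exactly the same answer, so nothing is noticeable from inside.

```python
# Hypothetical sketch: swap the labels AND the whole association network.
associations = {"red": "hot", "blue": "cold"}
memories = ["red", "blue", "red"]

def relabel(mapping, assoc, mems):
    """Apply one consistent relabeling to associations and memories alike."""
    new_assoc = {mapping.get(k, k): v for k, v in assoc.items()}
    new_mems = [mapping.get(m, m) for m in mems]
    return new_assoc, new_mems

swap = {"red": "blue", "blue": "red"}
assoc_after, mems_after = relabel(swap, associations, memories)

# Every internal test ("does this percept still match my memory of
# red-being-hot?") comes out the same, because both sides were swapped.
checks_before = [associations[m] for m in memories]   # ['hot', 'cold', 'hot']
checks_after = [assoc_after[m] for m in mems_after]   # ['hot', 'cold', 'hot']
print(checks_before == checks_after)  # True: indistinguishable from inside
```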