Materialism predicts that algorithms have an “inside”?
Yes. The scene from within a formal system (like algebra) has certain qualities (equations, variables, functions, etc.) that differ from the scene outside (markings on paper, the equals sign, BEDMAS, variable names, brackets for function application).
That’s not really a materialism thing, it’s a math thing.
As a further note, I’ll have to say that if all the blue and all the red in my visual experience were switched around, my hunch tells me that I’d be experiencing something different; not just in the sense of different memory associations, but that the visual experience itself would be different. It would not just be that “red” is associated with hot and “blue” is associated with cold… The qualia of the visual experience itself would be different.
Hence the part where they are compared to other qualia. Maybe that’s not enough, but imagining getting “blue” or “sdfg66df” instead of “red” (which is the evidence you are using) will of course return “they are different,” because they don’t compare equal — even if the output of the computation ends up being the same.
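The “same output despite unequal internal labels” point can be sketched as a toy relabeling. This is a hypothetical analogy, not a claim about how minds work: the “qualia” here are just strings, and the swapped agent permutes both its labels and its association table, so its input-output behavior is unchanged even though no internal label compares equal to the original’s.

```python
# Toy inverted-spectrum sketch. All names are hypothetical, invented for
# this illustration; the point is only that a consistent permutation of
# internal labels preserves behavior.

SWAP = {"red": "blue", "blue": "red"}  # the inversion permutation

def behavior(stimulus):
    """Original agent: internal label equals the stimulus name."""
    associations = {"red": "hot", "blue": "cold"}
    return associations[stimulus]

def inverted_behavior(stimulus):
    """Inverted agent: internal label is swapped, and the association
    table is re-wired to match, so outward behavior is identical."""
    associations = {"blue": "hot", "red": "cold"}  # table permuted too
    internal_label = SWAP[stimulus]                # label permuted
    return associations[internal_label]

for stimulus in ("red", "blue"):
    assert SWAP[stimulus] != stimulus                      # labels differ inside
    assert behavior(stimulus) == inverted_behavior(stimulus)  # outputs agree
```

The internal comparison (“is my label for this the same as ‘red’?”) returns a different answer in the two agents, while every externally observable response is identical — which is the functional-equivalence intuition the comment is pushing against.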
I’m under the impression that what you describe falls under computationalism, not materialism, but my reading on these ideas is shallow and I may be confusing some of these terms...
I must say I can’t tell the difference between materialism “the mind is built of stuff” and computationalism “the mind is built of algorithms (running on stuff)”.
If I get them confused in some way, sorry.