Computability shows that you can have a classical computer that has the same input/output behavior.
That’s what I mean (I’m talking about the input/output behavior of individual neurons).
Input/output behavior is generally not considered to be enough to guarantee the same consciousness.
It should be, because it is, in fact, enough. (However, neither the post nor my comment requires that.)
Eliezer himself argued that a GLUT isn't conscious.
Yes, and that’s false (but since that’s not the argument in the OP, I don’t think I should get sidetracked).
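(For anyone who hasn't seen the GLUT thought experiment: a Giant Lookup Table just stores a precomputed output for every possible input, so it matches a system's input/output behavior without doing any of the underlying computation. A minimal Python sketch, with a hypothetical stand-in function and domain:)

```python
# Minimal sketch of the GLUT (Giant Lookup Table) idea: a table that merely
# reproduces a function's input/output behavior without doing the computation.
# The function and input domain here are hypothetical stand-ins.

def brainlike_process(stimulus: int) -> int:
    """Stand-in for some computation (e.g., a neuron's response function)."""
    return (stimulus * 3 + 1) % 7

# Precompute the output for every possible input over a finite domain.
DOMAIN = range(100)
glut = {s: brainlike_process(s) for s in DOMAIN}

def glut_process(stimulus: int) -> int:
    """Pure table lookup: same input/output behavior, no computation inside."""
    return glut[stimulus]

# The two are input/output-equivalent over the whole domain.
assert all(brainlike_process(s) == glut_process(s) for s in DOMAIN)
```

Scaled up to a whole person's behavior, that table is the GLUT Eliezer's argument is about.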
Nonetheless, if the only formalized proposal for consciousness doesn't have the property that simulations preserve consciousness, then clearly the property is not guaranteed.
That’s false. Even if we grant for a second that IIT really is the only formalized theory of consciousness, it doesn’t follow that the property is not, in fact, guaranteed. It could also be that IIT is wrong and that, in reality, the property is guaranteed.
That’s what I mean (I’m talking about the input/output behavior of individual neurons).
Ah, I see. Nvm then. (I misunderstood the previous comment as applying to the entire brain; idk why, since it was pretty clear that you were talking about a single neuron. My bad.)