There’s an aspect of this which I haven’t yet mentioned, which is the following:
We can imagine different strains of functionalism. The weakest would just be: “A person’s mental state supervenes on their (multiply realizable) ‘functional state’.” This leaves the nature of the relation between functional state and mental state utterly mysterious, and thereby leaves the ‘hard problem’ looking as ‘hard’ as it ever did.
But I think a ‘thoroughgoing functionalist’ wants to go further, and say that a person’s mental state is somehow constituted by (or reduces to) the functional state of their brain. It’s not a trivial project to flesh out this idea—not simply to clarify what it means, but to begin to sketch out the functional properties that constitute consciousness—but it’s one that various thinkers (like Minsky and Dennett) have actually taken up.
And suppose one ends up hypothesising that what matters for whether a system is ‘conscious’ is (say) whether it represents information in a certain way, has a certain kind of ‘higher-order’ access to its own state, or whatever. These are functional properties that can be scaled up and down in scope and complexity without any obvious ‘thresholds’ being encountered that might correspond to the appearance of consciousness—so one has grounds for saying that there isn’t always a ‘fact of the matter’ as to whether a being is conscious.
Then it’s time to return to the rest of your comment—the whole discussion so far has just been about that one claim, that something can be neither conscious nor not-conscious. So now I’ll quote myself:
The property dualism I’m talking about occurs when basic sensory qualities like color are identified with such computational properties. Either you end up saying “seeing the color is how it feels”—and “feeling” is the extra, dual property—or you say there’s no “feeling” at all—which is denial that consciousness exists. It would be better to be able to assert identity, but then the elements of a conscious experience can’t really be coarse-grained states of neuronal ensembles, etc—that would restore the dualism.
It would be better to be able to assert identity, but then the elements of a conscious experience can’t really be coarse-grained states of neuronal ensembles, etc—that would restore the dualism.
By “coarse-grained states” do you mean that, say, “pain” stands to the many particular neuronal ensembles that could embody pain, in something like the way “human being” stands to all the actual individual human beings? How would that restore a dualism, and what kind of dualism is that?