I agree that “practical CF” as thus defined is false—indeed I think it’s so obviously false that this post is massive overkill in justifying it.
But I also think that “practical CF” as thus defined is not in fact a claim that computational functionalists tend to make.
The term ‘functionalist’ is overloaded. A lot of philosophical terms are overloaded, but ‘functionalist’ is the most egregiously overloaded of all philosophical terms because it refers to two groups of people with two literally incompatible sets of beliefs:
(1) the people who are consciousness realists and think there’s this well-defined consciousness stuff exhibited by human brains, and also that the way this stuff emerges depends on what computational steps/functions/algorithms are executed (whatever that means exactly)
(2) the people who think consciousness is only an intuitive model, in which case functionalism is kinda trivial and not really a thing that can be proved or disproved, anyway
Unless I’m misinterpreting things here (and OP can correct me if I am), the post is arguing against (1), but you are (2), which is why you’re talking past each other here. (I don’t think this sequence in general is relevant to your personal views, which is what I also tried to say here.) In the definition you rephrased
would cause the same conscious experience as that brain, in the specific sense of thinking literally the exact same sequence of thoughts in the exact same order, in perpetuity.
… consciousness realists will read the ‘thinking’ part as referring to thinking in the conscious mind, not to thinking in the physical brain. So this reads as obviously false to you because you don’t think there is a conscious mind separate from the physical brain, and the thoughts in the physical brain obviously aren’t ‘literally exactly the same’ in the biological brain vs. the simulation. But the (1) group does, in fact, believe in such a thing, and their position does more or less imply that the simulation would be thinking the same thoughts.
I believe this is what OP is trying to gesture at as well with their reply here.