I seriously don’t know whether subjective experience is mostly independent of hardware implementation or not. I don’t think we can know for sure.
However: if we take the position that conscious experience is strongly connected with behaviour, such as writing about those conscious experiences, then it has to be largely hardware-independent, since the behaviour of an isomorphic system is identical.
So my expectation is that it probably is hardware-independent, and that any system that internally implements isomorphic behaviour probably is at least very similarly conscious.
In any event, we should probably treat such systems as being conscious even if we can’t be certain. After all, none of us can be truly certain that any other humans are conscious. They certainly behave as if they are, but that leads back to “… and so does any other isomorphic system”.