But isn’t it still possible that a simulation that lost its consciousness would retain memories about consciousness sufficient, even without access to real consciousness, to generate even ‘novel’ content about consciousness?
That’s possible, although then the consciousness-related utterances would be of the form “oh my, I seem to have suddenly stopped being conscious” or the like (if you believe that consciousness plays a causal role in human utterances such as “yep, I introspected on my consciousness and it’s still there”). That would imply the simulation was not a faithful synaptic-level WBE, since its macro-level behaviour would clearly differ.