[Question] Isomorphisms don’t preserve subjective experience… right?
I’ve seen a couple of discussions about brain/CNS uploads, and there seems to be a common assumption that an upload would be conscious in the way that we are conscious. I’ve even seen some anthropic-principle-style arguments that we are likely in a simulation because uploading seems theoretically feasible.
When I think of a simulation, I think of a mathematical isomorphism from the relevant pieces of the brain/CNS onto some model of computation, combined with an isomorphism from possible environments onto the inputs to that model.
But that model of computation could be anything. It could be a quantum supercomputer. It could be a slow classical computer with a huge memory. Heck, the rules could be written out in a huge file and handed to a human who slowly works through the computations by hand.
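To make the substrate-independence point concrete, here is a toy sketch (an invented example, not an actual brain model or anyone's upload proposal): a tiny abstract state machine realized on two different "substrates", a direct table lookup and a hand-simulation-style interpreter working on an isomorphically encoded copy of the states. By construction, the two realizations compute the same input-output mapping.

```python
# Toy illustration of substrate independence: one abstract transition
# function, two different realizations. All names here are made up.

# Abstract system: a 3-state machine with transitions on inputs 0/1.
TRANSITIONS = {
    ("A", 0): "B", ("A", 1): "C",
    ("B", 0): "A", ("B", 1): "C",
    ("C", 0): "C", ("C", 1): "A",
}

def run_direct(state, inputs):
    """'Fast hardware': direct dictionary lookups on the states."""
    for bit in inputs:
        state = TRANSITIONS[(state, bit)]
    return state

def run_by_hand(state, inputs):
    """'Human with pencil and paper': the same rules, applied one
    step at a time to an isomorphically encoded copy of the states."""
    encode = {"A": 0, "B": 1, "C": 2}           # the isomorphism
    decode = {v: k for k, v in encode.items()}  # its inverse
    # Re-express the transition rules on the encoded states.
    table = {(encode[s], i): encode[t] for (s, i), t in TRANSITIONS.items()}
    x = encode[state]
    for bit in inputs:
        x = table[(x, bit)]
    return decode[x]

inputs = [1, 0, 1, 1, 0]
assert run_direct("A", inputs) == run_by_hand("A", inputs)
```

The question in this post is whether anything over and above this input-output equivalence (in particular, subjective experience) is preserved across such realizations.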
So this seems to suggest that the mathematical structure itself is conscious, which feels absurd (not to mention that the implications are downright terrifying). It seems, then, that subjective experience should require some sort of hardware dependence. Is this a generally accepted conclusion?