I accept the second horn of the dilemma. Consciousness is not computation abstracted from substrate. Computational capabilities alone never imply consciousness. States of consciousness are what they are, objectively and intrinsically; states of computation depend on semantic imputation by an observer and on underdetermined coarse-graining of physical state space.
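To make the coarse-graining point concrete, here is a minimal toy sketch (my own illustration; the trajectory, thresholds, and function names are invented, not drawn from the original argument). The same sequence of physical microstates gets read as two different computations depending on which observer-chosen mapping from microstates to computational labels is applied:

```python
# Toy illustration: one physical trajectory, two computational readings.
# The "microstates" and both coarse-grainings are invented for illustration.

# A sequence of physical microstates (say, voltage levels sampled over time).
trajectory = [0.1, 0.9, 0.4, 0.8, 0.2, 0.7]

# Observer A's coarse-graining: anything above 0.5 counts as logical 1.
def coarse_grain_a(v):
    return 1 if v > 0.5 else 0

# Observer B's coarse-graining: bins the very same voltages into three symbols.
def coarse_grain_b(v):
    if v < 0.3:
        return "LOW"
    elif v < 0.75:
        return "MID"
    return "HIGH"

# The physics is identical; the "computation" differs with the mapping chosen.
print([coarse_grain_a(v) for v in trajectory])  # [0, 1, 0, 1, 0, 1]
print([coarse_grain_b(v) for v in trajectory])  # ['LOW', 'HIGH', 'MID', 'HIGH', 'LOW', 'MID']
```

Nothing in the trajectory itself privileges one reading over the other; the "computation" is fixed only once a mapping is imputed, which is the sense in which computational states, unlike conscious states, are observer-relative.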
You could have a dualism with a bridging law which associated states of consciousness with a particular arbitrary refinement of a computational coarse-graining to the point of completeness, but (i) it would still not be identity, and (ii) it would be exceedingly complicated. So, especially given the hints of ontological nonlocality we get from quantum theory, I think it better to look for a new ontology in which the mind is not presupposed to be an aggregate of spatial parts to begin with. This may or may not involve new physics, in the sense of new mathematics; it may be that the change in perspective required involves no more extra formalism than does MWI. It does, very probably, require new biophysics and neuroscience, in the form of mesoscopic quantum phenomena in the brain that are functionally relevant to cognition. And it very definitely requires backing away from the attempt to reduce consciousness to combinations of known physical properties. If anything, we have to go the other way: understand consciousness in itself, to the extent that that is possible, and then use that to understand what physical properties actually “are”, once the conscious mind has been identified with a particular part of the brain as formally described by physics.
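To fix ideas, the bridging-law option can be put schematically (the symbols below are my own and purely illustrative, not anything proposed above):

```latex
% Schematic only; Omega, S, S', Q and the maps are illustrative notation.
\[
  C \colon \Omega \to S
  \qquad \text{(computational coarse-graining: observer-relative, underdetermined)}
\]
\[
  R \colon \Omega \to S'
  \qquad \text{(refinement of } C \text{ ``to the point of completeness'': } R \text{ injective)}
\]
\[
  \beta \colon S' \to Q
  \qquad \text{(bridging law from refined states to states of consciousness)}
\]
```

Read this way, (i) says that β merely associates two distinct kinds of state rather than identifying them, and (ii) says that β would have to be specified over an enormous, physically unmotivated domain.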
None of that implies that the physics of consciousness is noncomputable, by the way, in the sense of not being susceptible to exact simulation, so I will demur from statement 2. Consciousness involves a series of states; it can formally be described in terms of state transitions; and so it can be simulated on a Turing machine, unless there really is some Turing-busting basic cognitive operation, like Feferman reflection. But the simulation won’t be consciousness unless it’s happening on the right substrate (e.g. one irreducible quantum tensor factor, rather than a product of them); that’s the implication. That is hard to see if you think of the formal mathematical language (“tensor factor”) as the fundamental description, and the psychologistic language (“intentionality”) as phlogiston-talk. The actual nature of consciousness is expressed by concepts like intentionality, qualia, etc., and any description in terms of Hilbert spaces (and so forth) will be purely formal and dynamical. This is why ontology is more fundamental than physics (as physics is presently understood).
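The contrast between one irreducible tensor factor and a product of factors can be shown in a two-qubit toy case (my own sketch, not from the text above; it uses the Schmidt rank as the standard test of whether a joint pure state factorizes, and assumes numpy):

```python
# Toy two-qubit illustration: a product state factorizes into two independent
# tensor factors, while an entangled state does not, i.e. it is one
# "irreducible" factor. The example states are standard, the framing is mine.
import numpy as np

def schmidt_rank(state, dim_a=2, dim_b=2):
    """Number of nonzero Schmidt coefficients of a bipartite pure state.
    Rank 1 means the state is a product of two factors; rank > 1 means it
    cannot be split into independent subsystem states."""
    matrix = state.reshape(dim_a, dim_b)
    singular_values = np.linalg.svd(matrix, compute_uv=False)
    return int(np.sum(singular_values > 1e-12))

# |00>: a product state, |0> tensor |0>.
product = np.kron(np.array([1.0, 0.0]), np.array([1.0, 0.0]))

# (|00> + |11>)/sqrt(2): a Bell state, not expressible as a tensor product.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

print(schmidt_rank(product))  # 1 -> reducible: two separate factors
print(schmidt_rank(bell))     # 2 -> irreducible: one entangled whole
```

This only exhibits the formal distinction between a factorizable and a non-factorizable state; whether any such irreducible factor in the brain is the substrate of consciousness is precisely the further claim at issue.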
I accept your warning (as Yu’el) that maybe things are radically other than I have ever imagined. I don’t insist that this, for example, is definitely the right way ahead; far from it. But I have considerable confidence that most existing ideas about how to fit the mind into natural science are wrong, as they involve either spurious identities which break down on examination, or outright denial of phenomenological facts that are ontologically inconvenient.