So there is no single macrostate model of the brain determined by its structure. There is always a choice of which coarse-graining to use. Maybe now you can see the problem: if conscious states are computational macrostates, then they are not objectively grounded, because every macrostate exists in the context of a particular coarse-graining, and other ones are always possible.
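As a toy illustration of that ambiguity (my own sketch, not part of the argument above): the same microstate can be summarized under many different coarse-grainings, each yielding a different macrostate.

```python
# Toy sketch (hypothetical example): the same microstate admits many
# coarse-grainings, each producing a different macrostate, so the
# macrostate is not fixed by the system's structure alone.

microstate = (0, 1, 1, 0, 1, 0, 1, 1)  # a made-up 8-bit "brain" state

def coarse_grain_count(state):
    """Coarse-graining A: total number of active units."""
    return sum(state)

def coarse_grain_parity(state):
    """Coarse-graining B: parity of each half of the state."""
    half = len(state) // 2
    return (sum(state[:half]) % 2, sum(state[half:]) % 2)

print(coarse_grain_count(microstate))   # 5
print(coarse_grain_parity(microstate))  # (0, 1)
```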
Here’s the point of divergence: there is a peculiar coarse-graining. Specifically, it is the conceptual self-model that consciousness uses to operate on (as I wrote earlier, it uses concepts of self, mind, desire, intention, emotion, memory, feeling, etc. When I think “I want to know more”, my consciousness uses concepts from that model to (crudely) represent the actual state of (part of) the brain, including the parts that represent the model itself). Thus, to find consciousness in a system it is necessary (though not sufficient) to find a coarse-graining such that the corresponding macrostate of the system is isomorphic to the physical state of a part of the system. Or, in the map-territory analogy: to find a part of the territory that is isomorphic to a (crude) map of the territory.
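A minimal sketch of how that necessary condition could be checked on a toy system (my own construction: it uses literal equality on a contiguous slice as a stand-in for isomorphism, and a majority-vote coarse-graining, neither of which is specified above):

```python
def coarse_grain(state, block_size=2):
    """Crude 'map' of the territory: per-block majority bits."""
    blocks = [state[i:i + block_size] for i in range(0, len(state), block_size)]
    return tuple(1 if sum(b) > len(b) / 2 else 0 for b in blocks)

def find_self_model(state, block_size=2):
    """Look for a contiguous part of the state that equals the coarse map.

    Equality on a contiguous slice stands in for isomorphism here; per the
    text, finding such a part is only necessary, not sufficient.
    """
    macro = coarse_grain(state, block_size)
    k = len(macro)
    for start in range(len(state) - k + 1):
        if tuple(state[start:start + k]) == macro:
            return start, macro
    return None

# Hand-built microstate whose first six bits happen to encode the majority
# bits of its own six blocks: a part of the territory matching the crude map.
state = (0, 0, 0, 1, 0, 1, 1, 1, 0, 1, 1, 1)
print(coarse_grain(state))     # (0, 0, 0, 1, 0, 1)
print(find_self_model(state))  # (0, (0, 0, 0, 1, 0, 1))
```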
Edit: Well, it seems that a lower bound on the information content of the map is necessary for this approach too. However, this approach doesn’t require adding fundamental ontological concepts.
Edit: The isomorphism condition is too limiting; it would require another level of coarse-graining to hold. I’ll try to come up with another definition.