Yeah I probably should have, thanks for the comment.
What I meant by simulation was whatever model the brain has of itself, and my question was whether that model is necessary for consciousness to arise. I don't have a really precise definition of consciousness, but I mean what my experience feels like: being me feels like something, while I'd assume a basic computer program or an object doesn't feel anything. The distinction between the simulation and base reality comes down to where the computing happens (in an abstract sense): the brain is computing me and what I'm feeling, and the computed thing is what I mean by simulation. One way this might be testable is that it predicts that if an agent is not modeling itself internally, we can rule out that it's conscious.