humans don’t seem to have continuity of consciousness, in that we sleep
Yes, humans do sleep. Let’s suppose that consciousness “pauses” or “vanishes” during sleep. Is that how you would define a discontinuity in consciousness? An interval of time delta_t without consciousness separating two conscious processes?
In Time and Free Will: An Essay on the Immediate Data of Consciousness, Bergson defines duration as inseparable from consciousness. What would it mean to change consciousness instantaneously with teleportation? Would we need a minimum delta_t for it to make sense (maybe we could infer it from physical constraints given by general relativity)?
Also, it seems plausible that you could fake the subjective continuity of consciousness in a sim.
The question is always how do you fake it. Assuming physicalism, there would be some kind of threshold of cerebral activity which would lead to consciousness. At what point in faking “the subjective continuity of consciousness” do we reach this threshold?
I think my intuition behind the idea that (minimum cerebral activity + continuity of memories) is what leads to human-like “consciousness” comes from the first season of Westworld, where Maeve, a host, becomes progressively self-aware by trying to connect her memories over the course of the season (the same goes for Dolores).
Yes, humans do sleep. Let’s suppose that consciousness “pauses” or “vanishes” during sleep. Is that how you would define a discontinuity in consciousness? An interval of time delta_t without consciousness separating two conscious processes?
Ok.
In Time and Free Will: An Essay on the Immediate Data of Consciousness, Bergson defines duration as inseparable from consciousness.
So? Maybe he is wrong. Even if he is right, it only means that durée has to stop when consciousness stops, and so on.
What would it mean to change consciousness instantaneously with teleportation?
I don’t see why that arises. If consciousness is a computational process and you halt it, then when you restart it, it restarts in an identical state, so no change has occurred.
The question is always how do you fake it.
Computationally, that is easy. The halted process has no way of knowing it is not running when it is not running, so there is nothing to be papered over.
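As a toy illustration of this point (a minimal sketch assuming consciousness-as-computation; the names here are made up, not anyone’s actual proposal): a halted process can be frozen, stored for an arbitrary wall-clock interval, and resumed in exactly the state it was halted in, and nothing in its internal state can record the gap.

```python
import pickle

def step(state):
    # One tick of a toy "mind": it can only read and write its own internal state.
    state["ticks"] += 1
    state["log"].append(state["ticks"])
    return state

state = step(step({"ticks": 0, "log": []}))  # run two ticks

snapshot = pickle.dumps(state)    # halt: freeze the complete state
# ... arbitrary real time may pass here; nothing inside the snapshot changes ...
resumed = pickle.loads(snapshot)  # restart

assert resumed == state           # identical state: the pause left no trace inside
resumed = step(resumed)           # execution continues as if uninterrupted
```

The only way to detect the pause would be to consult something outside the process (a wall clock); from the inside, the halted interval simply does not exist.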
Assuming physicalism, there would be some kind of threshold of cerebral activity which would lead to consciousness.
Assuming physicalism is true, and computationalism is false … you are not in a simulation.
Aside from that, you seem to think that when I am talking about halting a sim, I am emulating some gradual process like falling asleep. I’m not.
I think my intuition behind the idea that (minimum cerebral activity + continuity of memories) is what leads to human-like “consciousness”
I think you are just conflating consciousness (conscious experience) and sense-of-self. It is quite possible to have one without the other; e.g., severe amnesiacs are not p-zombies.
Aside from that, you seem to think that when I am talking about halting a sim, I am emulating some gradual process like falling asleep. I’m not.
I was not thinking that you were talking about a gradual process.
I think you are just conflating consciousness (conscious experience) and sense-of-self. It is quite possible to have one without the other; e.g., severe amnesiacs are not p-zombies.
I agree that I am not being clear enough (with myself and with you) and appear to be conflating two concepts. With your example of amnesiacs and p-zombies, two things come to mind:
1) p-zombies: when talking about ethics (for instance in my Effective Egoist article), I was aiming at qualia, rather than just simulated conscious agents (with conscious experience, as you say). To come back to your first comment, I wanted to say that “identity and continuity of consciousness” contribute to qualia, and make p-zombies less probable.
2) amnesiacs: in my video game, I don’t want to play in a world full of amnesiacs. If, whenever I ask questions about their past, they are evasive, it does not feel real enough. I want them to have some memories. Here is a claim:
(P) “For memories to be consistent, the complexity needed would be the same as the complexity needed to emulate the experience which would produce the memory”
I am really unsure about this claim (one could produce fake memories just good enough for people not to notice anything; we don’t have great memories ourselves). However, I think it casts light on what I wanted to express with “The question is always how do you fake it”: the memories must be real/complex enough for the agents not to notice anything (and for the guy in the me-sim too), but not too complex (otherwise you could just run full simulations).
1) p-zombies: when talking about ethics (for instance in my Effective Egoist article), I was aiming at qualia, rather than just simulated conscious agents (with conscious experience, as you say). To come back to your first comment, I wanted to say that “identity and continuity of consciousness” contribute to qualia, and make p-zombies less probable.
Why? To what extent?
2) amnesiacs: in my video game, I don’t want to play in a world full of amnesiacs. If, whenever I ask questions about their past, they are evasive, it does not feel real enough. I want them to have some memories.
I was trying to use evidence from (presumed) real life.
I am really unsure about this claim (one could produce fake memories just good enough for people not to notice anything; we don’t have great memories ourselves). However, I think it casts light on what I wanted to express with “The question is always how do you fake it”: the memories must be real/complex enough for the agents not to notice anything (and for the guy in the me-sim too), but not too complex (otherwise you could just run full simulations).
I don’t know why you would “want” to be in a simulation at all. Most people would find the idea disturbing.