I kinda feel like I literally have more subjective experience after going through ego death/rebirth. I suspect that humans vary quite a lot in how often they are conscious, and to what degree. And if you believe, as I do, that consciousness is ultimately algorithmic in nature (like, in the “Surfing Uncertainty” predictive-processing view, that it is a human-modeling process which models itself in order to transmit prediction-actions), it would not be crazy for it to be a kind of mental motion that we sometimes do more or less of, and which some people lack entirely.
I don’t draw any moral conclusions about this because I don’t ethically value people or things in proportion to how conscious they are. I love the people I love, I’m certainly happy they are alive, and I would be happier for them to be more alive, but this is not why I love them.
I think you are probably attending more often to sensory experiences, and thereby both creating and remembering more detailed representations of physical reality.
You are probably doing less abstract thought, since the number of seconds in a day hasn’t changed.
Which do you want to spend more time on? And which sorts? It’s a pretty personal question. I like to try to make my abstract thought productive (relative to my values), freeing up some time to enjoy sensory experiences.
I’m not sure there’s a difference in representational density between sensory experience and abstract thought. Maybe there is. One factor that makes it seem like you’re having more sensory experience is how much you can remember after a set amount of time; another is whether each moment seems more intense because of the strong emotional experience attached to it.
Or maybe you mean something different by “more subjective experience.”
You are definitely right about the tradeoff between my direct sensory experience and other things my brain could be doing, like calculation or imagination. I hope that with practice or clever tool use I will get better at something like running multiple modes at once, task-switching faster between them, or having a more accurate yet more compressed integrated gestalt self.
tbh, my hidden motivation for writing this is that I find it grating when people say we shouldn’t care how we treat AI because it isn’t conscious. this logic rests on the assumption that consciousness == moral value.
if tomorrow you found out that your mom had stopped experiencing the internal felt sense of “I”, would you stop loving her? would you grieve as if she were dead or comatose?
It depends entirely on what you mean by consciousness; the term is used for several distinct things. If my mom had lost her sense of individuality but was still having a vivid experience of life, I’d keep valuing her. If she were no longer having a subjective experience at all (which would pretty much require being unconscious, since her brain produces an experience as part of how it does its work), I would no longer value her, but would consider her already gone.
interesting. what if she has her memories and some abstract theory of what she is, and that theory is about as accurate as anyone else’s, but her experiences are not very vivid at all? she’s just going through the motions, running on autopilot all the time, like when people fall into a kind of trance while driving.