I can definitely create mental models of people who have a pain-analogue which affects their behavior in ways similar to how pain affects mine, without their pain-analogue causing me pain.
There’s no point in reducing this to a minimal Platonic concept of ‘simulating’ in which simulating excruciating pain causes excruciating pain regardless of physiological effects.
I think this is the crux of where we disagree. I don’t think it matters if pain is “physiological” in the sense of being physiologically like how a regular human feels pain. I only care if there is an experience of pain.
I don’t know of any difference between physiological pain and the pain-analogues I inflicted on my mental models that I would accept as necessary for the latter to qualify as an experience of pain. But since you clearly do think there is such a difference, what would you say it is?
How are you confident that you’ve simulated another conscious being who feels emotions with the same intensity as the ones you would feel in that situation, rather than just running a low-fidelity simulation with decreased emotional intensity, which is how it registers in your brain’s memories?
Whatever subjective experience you are simulating, it is still running in your brain, on the same cognitive structures you use to generate your subjective ‘I’ (I find this to be the simplest hypothesis). The simplest conclusion to draw, then, is that whatever your simulation felt gets registered in your brain’s memories, and if those recorded emotions lack much of the intensity you would experience in that situation yourself, that is also the degree of emotional intensity the simulated being felt.
Points similar to this have come up in many comments, so I’ve added an addendum at the end of my post where I give my point of view on this.
I’d understood that already, but I would need a reason to find it believable, because it seems really unlikely. You are not directly simulating the being’s cognitive structures; that’s impossible. The only way you can simulate someone is by repurposing your own cognitive structures to simulate them, and then the intensity of their emotions is the same as what you registered.
How simple do you think the emergence of subjective awareness is? Most people will say that you need dedicated cognitive structures to generate the subjective ‘I’. Even in theories that amount to something like strange loops or higher-level awareness, such as higher-order thought (HOT) theories or attention schema theory (AST), you still need a bound locus of experience. If that’s so, then there’s no room for conscious simulacra that feel things the simulator doesn’t.
This is from a reply that I gave to Vladimir:
I think the main problem here would be simulating beings that are suffering considerably. If you don’t suffer much while simulating them (which is how most people experience these simulations, except perhaps those who are hyper-empathic or who have really detailed tulpas/personas), then it’s not a problem.
It might be a problem in a case like consciously creating a persona that you then want to delete, where the persona is aware of this and feels bad about it (or, more generally, if you know that you’ll create a persona that will suffer, e.g. from disliking certain aspects of the world). But you should notice those feelings just as you notice the feelings of any of the ‘conflicting agents’ you might have in your mind.