I’d already understood that, but I would need a reason to find it believable, because it seems really unlikely. You are not directly simulating the cognitive structures of the being; that is impossible. The only way you can simulate someone is by repurposing your own cognitive structures to simulate them, and then the intensity of their emotions is the same intensity that you registered.
How simple do you think the emergence of subjective awareness is? Most people will say that you need dedicated cognitive structures to generate the subjective “I”. Even in theories that are mostly just something like strange loops or higher-level awareness, like HOT or AST, you at least still need a bound locus of experience. If that’s so, then there’s no room for conscious simulacra that feel things the simulator doesn’t.
This is from a reply that I gave to Vladimir:
I think the main problem here would be simulating beings that are suffering considerably. If you don’t suffer much while simulating them (which is how most people experience these simulations, except perhaps those who are hyper-empathic or who have really detailed tulpas/personas), then it isn’t a problem.
It might be a problem in a case where, say, you consciously create a persona that you later want to delete, and they are aware of it and feel bad about it (or, more generally, if you know you’ll create a persona that will suffer, for example from disliking certain aspects of the world). But you should notice those feelings just as you notice the feelings of any of the ‘conflicting agents’ you might have in your mind.
Points similar to this have come up in many comments, so I’ve added an addendum at the end of my post where I give my point of view on this.