I’m uncertain in which sense someone could be said to figure that out and still have a will of their own. It’s a bit beyond merely living in a simulation.
If ‘free will’ is compatible with physical determinism (including the quantum variety) then why can it not be similarly compatible with living in a world based on some guy’s thoughts? The same principles seem to apply.
I think the problem is a lack of detail. Harry isn’t being simulated down to the neuronal level, or even down to the brain region. ‘He’ is a loose set of rules and free-floating ideas that please Eliezer or survived his theories, a very small entity indeed. And the rest of his world is even more impoverished—the rest of the world may just be a few words like ‘the rest of the world’. Harry can’t even execute bounded loops unless Eliezer feels like formulating them and actually executing them.
If Harry discovered he were in fiction, his motivation to help the rest of the world would instantly vanish. In fact, the most moral thing he could do is to hide in his trunk forever—if a rape only happens when you go to rescue the rapee and the narration follows you, if the murders only happen because you went looking for murders, then out of sight, out of mind, out of reality. In a ‘real’ simulation, this would not be the case, even if the author would never permit a character to test this.
(He might still want to escape into our world if the author desires him to desire this, but help his world? His world can no more be helped than J.K. Rowling can help Zanzibar in canon HP. There is no there there.)
I think fictional characters can be more than that. I don’t know how Eliezer experiences Harry, but some authors talk about their characters talking back to them, or resisting some plot twists.
This suggests to me that some characters are similar to full human self-images, though with fewer memories.
Whatever his writer wants him to?
Haha, so Harry can “truly escape” by means of Eliezer going mad and imagining himself to be the escaped Harry. Or maybe they could time-share.
I believe I’m speaking for all of us in stating that I hope he isn’t aiming for that end. ;)