In terms of informational simplicity, there’s a measurement bound at the limits of your memory of perceptions. You can’t distinguish between a simulation of quantum or atomic (or even simple mechanical) rules and a tiny simulation of a point-in-time set of memories and experiences. You literally cannot know the difference (and there may be no difference—see the philosophical zombies debate) between an NPC and a “real person”.
If it turns out that this is your personal Basilisk punishment, and you’re being tortured for not bringing forth the creator in another universe, there is no theoretical or practical way to know that.
Can you elaborate on that last sentence? It seems like I’m going to like what you have to say about it.
(Are trigger warnings still a thing? It occurs to me that this topic may interact badly with suicidal thoughts. Please take it only as the silly exploration of imagination-space that it is.)
I don’t give a lot of weight to the basilisk possibility—that was something of a throwaway comment.
What I meant is that if the truth is that the simulation controller is specifically interested in you as an experience-haver within the simulation, then there is no possibility of intentionally influencing the simulation. Your perceptions and your cognition will be manipulated to make you believe whatever the simulator thinks furthers their goals. Your universe of perception and possible actions simply won’t contain things that counter their goals.
And one amusing (perhaps only to me) possible motivation for such a personal simulation is that my current life is the worst the attacker can imagine within the constraint that I must believe it’s real. Maybe I’m being tortured as punishment for some outside-universe crime. I find it amusing, as I assign positive value to this moment of experience (which is the only thing I can be sure of), so the basilisk is being thwarted in its mission of punishing me. And also because it seems ludicrously unlikely.