They all seem to be asking variants on the question "how likely is it that apparent reality is real?" They also all seem to have weird properties as far as evidence is concerned, because the observable evidence must all come from the very source (observed reality) whose credibility we're questioning.
Also, except for the solipsism one, they seem to be questions where, contrary to LW canon, it might be a good idea to deliberately self-delude (by which I mean, for instance, not bothering to look at the evidence in depth). If I really felt a .5 probability in my bones that I was living in a simulation, I don't think I'd be able to work as hard at achieving my goals; I wouldn't have as much will to power when it could all disappear at any moment.
Aside: I’m genuinely surprised at the lack of discussion of lucid dreaming on LW. Lucid dreaming seems like a big gaping loophole in reality, like one of the elements you’d need in a real-life equivalent of the infinite-wish-spell-cycle, yet nobody seems to be seriously experimenting with finding innovative uses for it.
In hindsight, though, it seems like removing the middle question might have been better.
If I really felt a .5 probability in my bones that I was living in a simulation, I don't think I'd be able to work as hard at achieving my goals; I wouldn't have as much will to power when it could all disappear at any moment.
Would that depend at all on your beliefs about the simulators?
E.g., if you felt a .5 probability that you were in a simulation run by a real person who shared various important attributes with you, and who was attempting to determine the best available strategy for achieving their goals, such that your success at achieving yours led directly to their success at achieving theirs, would your motivations change?
I agree that intuitions are challenging here, but I really cannot think of a reason to believe that my actions are less meaningful, or that reality is any more or less permanent, if we're all being simulated. So maybe there is a tie there to solipsism: I don't think I have any problem with simulations that are faithfully executing our physics, as opposed to some sort of patchwork sim in which I'm the only sentient being. If I thought solipsism was .5 probable, then I'd have the problem you describe.
I would like to be working on lucid dreaming research, but I'm unaware of any avenue toward obtaining the very expensive MRI time needed to do it.