I like thinking about being in a simulation, and since it makes no practical difference (unless you go crazy and decide it’s a good idea to test every possible means of ‘praying’ to any interested and intervening simulator god), I don’t think we need to agree on the odds that we are simulated.
However, it seems impossible to me to defend any particular choice of prior probability for the simulation vs. non-simulation cases. So while it matters how well such a hypothesis explains the data, I have no idea whether I should be raising p(simulation) by 1000 db of evidence starting from −10 db or from −10,000,000 db. If you have 1000 db worth of predictions following from a disjunction over possible simulations, that’s of course super interesting and amusing even if I can’t decide what my prior belief is.
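To make the decibel arithmetic concrete, here’s a minimal Python sketch (assuming Jaynes’s convention that evidence in decibels is 10·log10(odds), so a Bayesian update is just addition; `db_to_prob` is my own helper name, not anything from the thread):

```python
def db_to_prob(db: float) -> float:
    """Convert log-odds in decibels, 10 * log10(odds), back to a probability."""
    odds = 10 ** (db / 10)
    return odds / (1 + odds)

# In decibel form a Bayesian update is just addition:
#   posterior_db = prior_db + evidence_db
evidence_db = 1000
for prior_db in (-10, -10_000_000):
    posterior_db = prior_db + evidence_db
    print(f"prior {prior_db} db -> posterior {posterior_db} db "
          f"= p {db_to_prob(posterior_db):.3g}")
```

With a −10 db prior the posterior is essentially 1; with a −10,000,000 db prior it is still essentially 0. That’s the whole problem: the same 1000 db of evidence lands you in completely different places depending on a prior I don’t know how to pick.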
I’m sure there’s more to it than came across in that sentence, but that sounds like shaky grounds for belief.
Scientifically it’s bunk, but Bayesically it seems sound to me: a simple hypothesis that explains many otherwise unlikely pieces of evidence.
That said, I do have other reasons, but explaining the intuitions would not fit within the margins of my time.