If we are to define some prior distribution over what exists, out beyond what we can see, Kolmogorov complexity seems like a sensible metric to use. A universe generated by a small machine is much more likely a priori (perhaps we should assume it occurs with much greater frequency) than a universe that can only be generated by a large machine.
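To make that concrete (this is the standard Solomonoff-style universal prior, a formalization the original comment only gestures at): assign each universe U a weight proportional to 2^(-K(U)), where K(U) is the length in bits of the shortest program that generates U. Under that weighting, a universe requiring a 1,100-bit program is roughly 2^100 times less probable than one requiring a 1,000-bit program, which is the sense in which small machines dominate the prior.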
But an unsimulated universe is likeliest of all, by the same reasoning.
Actually, you don’t need to use Kolmogorov complexity specifically... most versions of Occam’s razor weigh against a simulated universe.
I propose that the maximum number of simple simulated universes that could be hosted within a supercomplex universe is unlikely to exceed the number of natural instances of simple universes lying about in the multiverse’s bulk.
That count is uncountably infinite in many versions of many-worlds (MW) theory, so it’s hard to exceed. But if you are going to treat MW theory as the main alternative to simulationism, you need to argue for it to some extent.