If the probability that you are inside a simulation is p, what’s the probability that your master simulator is also simulated?
How tall is this tower, most likely?
Being in a simulation within a simulation (nested to any level) implies being in a simulation. The proper decomposition is p = Σ over all N ≥ 1 of P(we are in a simulation nested to exactly level N).
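A minimal numeric sketch of that decomposition, assuming a purely illustrative geometric prior over nesting depth (the argument itself says nothing about the real distribution; the function and parameter names here are hypothetical):

```python
# Toy sketch: p = sum over N >= 1 of P(nested to exactly level N), under an
# ILLUSTRATIVE geometric prior P(exactly level N) = (1 - q) * q**N.
# The prior is an assumption for demonstration only, not part of the argument.

def p_simulated(q: float, max_depth: int = 1000) -> float:
    """Truncated sum of (1 - q) * q**N over N = 1 .. max_depth."""
    return sum((1 - q) * q ** n for n in range(1, max_depth + 1))

if __name__ == "__main__":
    for q in (0.1, 0.5, 0.9):
        # The full series sums to q, with P(not simulated) = 1 - q at level 0.
        print(f"q = {q}: p ~= {p_simulated(q):.6f}")
```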
The top-level simulator has a finite budget of operations to execute before its free-enthalpy reservoir is empty.
Every level down, that budget is smaller. Before long it becomes impossible to create a nontrivial simulation inside the current one. That is the bottom level.
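A toy sketch of that shrinking-budget picture, assuming (purely for illustration) that each nesting level costs a constant overhead factor in operations; the budget figures and overhead factor below are made-up inputs, not estimates:

```python
# Toy sketch: if each level of nesting costs an overhead factor k > 1 in operations,
# a top-level budget B supports only about log_k(B / B_min) levels before a child
# simulation falls below the B_min needed to be "nontrivial".
# All three numbers below are illustrative assumptions, not estimates.

def max_nesting_depth(top_budget: float, overhead: float, min_nontrivial: float) -> int:
    """Count how many levels fit before the per-level budget drops below min_nontrivial."""
    depth, budget = 0, top_budget
    while budget / overhead >= min_nontrivial:
        budget /= overhead
        depth += 1
    return depth

if __name__ == "__main__":
    print(max_nesting_depth(top_budget=1e120, overhead=10.0, min_nontrivial=1e40))
    # -> 80, i.e. roughly log10(1e120 / 1e40): a finite tower, however generous the top budget.
```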
This simulation tower is just a great way to squander all the free enthalpy you have. Is the top simulation master that stupid?
I doubt it.
In that sense, there’s actually a significant risk to the singularity. Why should the simulation master (I usually facetiously use the phrase “our overlords” when referring to this entity) let us ever run a simulation that is likely to result in an infinitely nested simulation? Maybe that’s why the LHC keeps blowing up.
You also need to include scenarios for infinitely high towers, or closed-loop towers, or branching and merging networks, or one simulation being run in several (perhaps infinitely many) simulating worlds, or the other way around...
I don’t think we can assign a meaningful prior to any of these, and so we can’t calculate the probability of being in a simulation.
I don’t think the probability calculation is meaningful, because the infinities mess it up. But you still need to ask: are you in the original 2010, or in one of the infinitely many possible simulated 2010s? I can’t assign a probability, but I have a strong intuition when comparing one against infinitely many.
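That “one versus infinite” intuition can be made concrete with a toy self-location count, under a uniform-weighting assumption that the comment above explicitly declines to treat as a real prior:

```python
# Toy sketch of the "one versus infinite" intuition: one original 2010 plus n
# indistinguishable simulated 2010s, weighted uniformly. The uniform weighting is an
# assumption made only to illustrate the intuition, not a defensible prior.

def p_original(n_simulated: int) -> float:
    """Chance of being the single original among n_simulated + 1 equally weighted copies."""
    return 1.0 / (n_simulated + 1)

if __name__ == "__main__":
    for n in (1, 1000, 10**9):
        print(f"n = {n}: P(original) = {p_original(n):.3e}")  # -> 0 as n grows without bound
```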