99% is not even the probability I would put on "it is possible to simulate something as big and complex as the world we are living in, for so long, with that level of consistency and precision".
How easy it is to simulate our universe depends on how much computing power is available in the parent universe. Where do you get your prior for how big and complex universes usually are?
The simulation argument implies that we should reason the other way around, and assign a prior over the complexity of universes broad enough that simulating our universe is not unusual.
The simulation argument itself is too speculative to be worth a 99% probability: it contains too many non-trivial logical steps and assumptions that could go wrong to deserve that level of confidence. If you made one hundred independent claims of the same magnitude and complexity as the simulation argument, more than one would contain a mistake (either one we could spot now, or one that depends on something we are unconsciously assuming to be true), making the whole claim erroneous. Humans are bad at predicting the far future.
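As a rough illustration (the per-step reliabilities below are made-up numbers, not estimates of the actual argument), confidence in a chained argument decays quickly even when each step is granted high reliability:

```python
# Illustrative sketch: the step reliabilities are assumptions chosen for the example.
def chained_confidence(step_reliabilities):
    """Probability that every step of a multi-step argument holds,
    treating the steps as independent."""
    total = 1.0
    for p in step_reliabilities:
        total *= p
    return total

# Five logical steps, each granted a generous 98% chance of being sound:
print(chained_confidence([0.98] * 5))  # ~0.904, already well below 0.99
```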
And then, the simulation argument only gives a choice between 3 possibilities, of which simulation is just one. It seems unreasonable to me to give the other two a combined probability of only 1%. Option 2 (that trans-humans would not care to, or not want to, run simulations of their ancestors) definitely deserves more than 1%; there is far too much uncertainty about what trans-humans will want and what their ethics would be.
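To make that budget concrete (again with purely illustrative numbers, not claims about the real values), even a modest allowance for the argument failing or for the other two disjuncts being true pulls the simulation option well below 99%:

```python
# Illustrative sketch: these probabilities are assumptions, not claims about the real values.
p_argument_sound = 0.95    # chance the trilemma's overall reasoning holds
p_other_disjuncts = 0.10   # combined weight on extinction and "trans-humans don't run ancestor simulations"

p_simulation = p_argument_sound * (1 - p_other_disjuncts)
print(p_simulation)  # 0.855, far short of 99%
```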