The simulation argument itself is too speculative to deserve a 99% probability: it contains too many non-trivial logical steps and assumptions that could go wrong. If you made one hundred independent claims of the same magnitude and complexity as the simulation argument, more than one would contain a mistake (either one we could spot now, or one that depends on something we are unconsciously assuming to be true), invalidating the whole claim. Humans are bad at predicting the far future.
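To make the intuition concrete, here is a minimal sketch of how conjunctive arguments lose probability mass. The numbers are purely hypothetical (the comment above does not specify how many steps the argument has or how reliable each one is); it just shows that even modestly uncertain steps compound quickly:

```python
# Hypothetical: suppose the argument chains together 5 independent steps,
# each of which we judge 95% likely to be sound.
p_step = 0.95
n_steps = 5

# Probability that every step holds at once (independence assumed).
p_all_sound = p_step ** n_steps
print(f"P(all {n_steps} steps sound) = {p_all_sound:.2f}")  # ~0.77
```

Under those (made-up) numbers, the conjunction is only about 77% likely to hold, well short of 99%, which is the point being made about multi-step speculative arguments.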
And then, the simulation argument only offers a choice between three possibilities, of which simulation is just one. It seems unreasonable to me to give the other two a combined probability of only 1%. Option 2 (trans-humans would not care to, or would not want to, run simulations of their ancestors) definitely deserves more than 1%; there is far too much uncertainty about what trans-humans will want and what their ethics will be.