P(sim): 99% seems way too high. 99% is not even what I would assign to "it is possible to simulate something as big and complex as the world we are living in, for so long, with that level of consistency and precision". The mere possibility of such a simulation wouldn't go above 95% for me; that we are actually inside one deserves much less.
P(ent|sim): most computers are used for entertainment (and even that isn't certain; many personal computers are also used for "serious" business), but the most powerful ones are not, and simulating something as big and complex as Earth sounds like a task that would require a very powerful computer.
P(ego|ent, sim): actually, very few games imply a god-like player. Some strategy games do (more or less obviously), but most don't. In a CRPG, players are heroes, not gods. In an action game, the player is a fighter, a spaceship pilot, a car racer, ... but definitely not a god-like character. A god-like player is even less common in multi-player games, which seem to be increasingly the popular ones (and, to go back to the previous point, if a computer is powerful enough to simulate a whole world just for fun, it seems more likely to me that it's something MMORPG-like).
P(follow-thru | chr0, ego, ent, sim, Earth): this one seems very clearly overestimated, by far. When players are done with a game, they no longer care about its NPCs. They never spend computing power to let the NPCs keep running after they are done playing. If the player realized the NPCs were self-aware, he might behave differently (but that implies knowing the ethics of the ones running the simulator); but then, why would he grant immortality only to those playing on his side? If he is ethical enough to spend computing power letting NPCs continue their lives, why not all of them?
Another point is missing: what if humans are meaningless to the people running the simulation? The simulation's goal, even if it is entertainment, could be a Master of Orion-style game, with very distant galactic empires controlled by players, and Earth just a minor NPC race not yet discovered by any player (and which may even stay undiscovered until the end of the game). So we should add a factor P(Earth is central | ego, ent, sim), which wouldn't be very high to me, knowing that there are about 10^21 stars in the observable universe.
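To see how much such an extra factor matters, the chain of conditional probabilities above can be multiplied out. All figures below are illustrative placeholders, not anyone's actual estimates:

```python
# Illustrative only: placeholder probabilities for the chain discussed above.
p_sim = 0.95      # P(sim): we are inside a simulation at all
p_ent = 0.50      # P(ent | sim): the simulation is run for entertainment
p_ego = 0.10      # P(ego | ent, sim): the game implies a god-like player
p_follow = 0.05   # P(follow-thru | ...): NPCs are kept running after the game

p_without = p_sim * p_ent * p_ego * p_follow
print(f"without Earth-central factor: {p_without:.4f}")

# Adding the proposed P(Earth is central | ego, ent, sim).
# With ~10^21 stars in the observable universe, even a generous guess that
# one star in a million matters to some player keeps this factor tiny.
p_earth_central = 1e-6
p_with = p_without * p_earth_central
print(f"with Earth-central factor:    {p_with:.2e}")
```

Whatever the individual guesses, one small factor like this dominates the whole product, which is the point being made.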
In the end, I could give my own estimates, but there is so much wild guessing involved (especially since, if we are in a simulation, we don't know much about the rules of the "real" universe, so speculating about it is pointless) that I don't see how the figure coming out of such an estimate would have any real meaning.
99% is not even what I would put to "it is possible to simulate something as big and complex as the world we are living in, for so long, with that level of consistency and precision".
How easily our universe can be simulated depends on how much computing power is available in the parent universe. Where do you get your prior for how big and complex universes usually are?
The simulation argument implies that we should reason the other way around, and set our prior expectation of the complexity of universes high enough that simulating our universe is not unusual.
The simulation argument itself is speculative enough not to be worth a 99% probability: it contains too many non-trivial logical steps and assumptions that could go wrong. If you made one hundred independent claims of the same magnitude and complexity as the simulation argument, more than one would contain a mistake (either one we could spot now, or one depending on things we are unconsciously assuming to be true) that makes the whole claim erroneous. Humans are bad at predicting the far future.
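The compounding of errors across steps can be put numerically; the per-step reliability below is an assumed placeholder, not a measured figure:

```python
# If an argument chains n non-trivial steps, each correct with probability
# p_step, the whole chain is correct with probability p_step ** n.
p_step = 0.99  # assumed placeholder: each step is 99% reliable

for n_steps in (1, 5, 10, 20):
    p_all = p_step ** n_steps
    print(f"{n_steps:2d} steps: P(all correct) = {p_all:.3f}")
```

Even with every step 99% reliable, a twenty-step chain ends up around 82%, which is nowhere near a 99% conclusion.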
And then, the simulation argument only offers a choice between three possibilities, of which simulation is just one. It seems unreasonable to me to give the sum of the other two only a 1% chance. Option 2 (that trans-humans would not care to, or not want to, run simulations of their ancestors) definitely deserves more than 1%: there is far too much uncertainty about what trans-humans will want and what their ethics would be.