This is a really fascinating idea, particularly the aspect that we can influence the likelihood we are in a simulation by making it more likely that simulations happen.
To boil it down to a simple thought experiment: suppose I am in a future where we have enormous computing power, and I know something bad will happen tomorrow (say, I'll be fired) barring some 1/1000-likelihood quantum event. No problem: I'll just make millions of simulations of the world with me in my current state, in each of which the 1/1000 event happens tomorrow. Since I'm almost certainly in one of the simulations I'm about to make, I'm saved!
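The arithmetic behind "almost certainly" is worth making explicit. Here's a minimal sketch (my illustration, not part of the original comment), assuming one base-reality copy of me, N simulated copies, each copy equally likely to be the one having this experience, and the saving event guaranteed to occur inside every simulation I run:

```python
# Toy self-location arithmetic for the simulation gambit.
# All numbers and modeling choices here are assumptions for illustration.

N = 1_000_000        # simulations I plan to run
p_event = 1 / 1000   # chance the saving event happens in base reality

p_sim = N / (N + 1)    # credence that I'm inside one of my simulations
p_base = 1 / (N + 1)   # credence that I'm the base-reality copy

# Saved with certainty if simulated; saved only via the quantum event if not.
p_saved = p_sim * 1.0 + p_base * p_event

print(f"P(I'm simulated) = {p_sim:.7f}")   # ~0.9999990
print(f"P(I'm saved)     = {p_saved:.7f}") # ~0.9999990
```

On this toy model the gambit looks nearly airtight, which is exactly why the disagreement below is about whether the credence shift tracks anything real.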
> This is a really fascinating idea, particularly the aspect that we can influence the likelihood we are in a simulation by making it more likely that simulations happen.
Maybe? We can increase our credence, but whether we thereby increase the likelihood seems like an open question. The intuitions split along Newcomb lines: a two-boxer would say that running the simulations can't causally change whether you are already in one, while a subset of one-boxers would say that your decision to run them is evidence that simulators in your position decide the same way.
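To make that split concrete, here's a deliberately crude sketch (entirely my toy model, with made-up numbers, not anyone's endorsed decision theory): an evidential reasoner updates on their own decision, while a causal reasoner treats the fact as already settled.

```python
# Toy contrast between causal and evidential reasoning about the gambit.
# Assumption (mine): the only relevant simulations are those run by agents
# in exactly my situation, and those agents reason the same way I do,
# so my decision is evidence about theirs.

N = 1_000_000  # copies created by anyone in my situation who runs sims

def evidential_credence(i_run_sims: bool) -> float:
    # Evidential view: conditioning on my decision tells me how many
    # copies of "someone in my situation" exist, so the credence moves.
    total_copies = N + 1 if i_run_sims else 1
    return (total_copies - 1) / total_copies

def causal_credence(prior: float) -> float:
    # Causal view: whether I'm already simulated is a settled fact;
    # deciding to run simulations now can't reach back and change it.
    return prior

print(evidential_credence(True))    # ~1.0: the one-boxer-style update
print(evidential_credence(False))   # 0.0 under this toy assumption
print(causal_credence(prior=0.5))   # unchanged, whatever the prior was
```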
That said, thank you for the secondary thought experiment, which is really interesting.