Let’s consider a similar problem. Suppose you’ve just discovered that you have cancer. You decide to buy a pill that will erase your memory of the diagnosis. From your new perspective, your chance of having cancer will be back to 2%, the rate for someone with your symptoms. In this situation, you can change your knowledge about your odds of having cancer, but you can’t change whether you actually have cancer, or reduce the probability that someone with your symptoms has cancer.
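A minimal sketch of this in Python, assuming (my reading) that the 2% is P(cancer | your symptoms) and that the diagnosis makes you near certain; the names here are mine, not from the example:

```python
BASE_RATE = 0.02  # assumed: P(cancer | your symptoms)

def credence(knows_diagnosis: bool) -> float:
    """Your best estimate given the evidence you currently hold."""
    # With the diagnosis in hand you are (near) certain; with that
    # memory erased, you fall back to the symptom base rate.
    return 1.0 if knows_diagnosis else BASE_RATE

has_cancer = True  # the fact itself, fixed by the world

print(credence(knows_diagnosis=True))   # 1.0  -> before the pill
print(credence(knows_diagnosis=False))  # 0.02 -> after the pill
print(has_cancer)                       # True -> unchanged either way
```

The pill only changes the input to `credence`; it never touches `has_cancer`.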
What seems to make this situation paradoxical is that when you make a decision, it seems to change the probability that you had before you made the decision. This isn’t quite what happens. If you accept determinism, you were always going to make that decision. If you had known that you were going to make this decision before you actually made it, then your probability estimate wouldn’t have changed when you made it. The reason the probability changes is that you have gained an additional piece of information: that you are the kind of person to vow to simulate yourself. You might have assigned a smaller probability to being such a person before you decided to actually do it.
What I had in mind isn’t a matter of manually changing your beliefs, but rather making an accurate prediction about whether or not you are in a simulated world (which is about to become distinct from the “real” world), based on your knowledge of the existence of such simulations. It could just as well be that you asked your friend to simulate 1000 copies of you at that moment and have him teleport you to Hawaii as 11 AM strikes.
This problem is more interesting than I thought when I first read it (as Casebash). If you decide not to create the simulation, you are indifferent about having made the decision, as you know that you are the original and that you were always going to have this experience. However, if you do make this decision, then you are thankful that you did, as otherwise there is a good chance that the simulated you wouldn’t exist and be about to experience a beach.
Firstly, I’m not convinced that simulating a person necessarily results in consciousness, but that is largely irrelevant to this problem, as we can simply pretend that you are going to erase your memory 1000 times.
If you are going to simulate yourself 1000 times, then the chance, from your perspective, of being transported to Hawaii is 1000/1001. This calculation is correct, but it isn’t a paradox. Deciding to simulate yourself doesn’t change what will happen; there isn’t an objective probability that jumps from near 0 to 1000/1001. The 0 was produced under a model where you had no tendency to simulate this moment, and the 1000/1001 was produced under a model where you are almost certain to simulate this moment. If an observer (with the same information you had at the start) could perfectly predict that you would make the decision to simulate, then they would report the 1000/1001 odds both before and after the decision. If they instead had 50% belief beforehand that you would make this decision, then this would put their odds at 0.5 × 1000/1001 = 500/1001.
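As I read it, the outside observer’s number is just a mixture of the two models, weighted by their credence that you will decide to simulate (other anthropic weightings would give a different answer for the 50% case). A small Python sketch of that mixture, with names of my own choosing:

```python
from fractions import Fraction

N_COPIES = 1000  # simulated copies, as in the setup above

# Conditional on the simulation being run, you are one of
# N_COPIES + 1 indistinguishable observers, N_COPIES of whom are
# in the simulation and about to find themselves in Hawaii.
p_hawaii_given_sim = Fraction(N_COPIES, N_COPIES + 1)  # 1000/1001

def p_hawaii(p_decide: Fraction) -> Fraction:
    """Outside observer's odds, given their credence p_decide that
    you will decide to simulate. If you don't, P(Hawaii) = 0."""
    return p_decide * p_hawaii_given_sim

print(p_hawaii(Fraction(1)))     # 1000/1001 -- perfect predictor
print(p_hawaii(Fraction(1, 2)))  # 500/1001  -- 50% belief beforehand
```

Nothing “jumps” here: as `p_decide` moves from 0 to 1, the reported odds move smoothly from 0 to 1000/1001.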
So, what is the paradox? If it is that you seem to be able to “warp” reality so that you are almost certainly about to teleport to Hawaii, my answer explains that: if you are about to teleport, then it was always going to happen anyway. The simulation was already set up.
Or are you trying to make an anthropic argument? That if you make such a decision and then don’t appear in Hawaii, it is highly unlikely that you will be uploaded at some point? This is the Sleeping Beauty problem. I don’t fully understand this yet.