What I had in mind isn’t a matter of manually changing your beliefs, but rather of making an accurate prediction about whether or not you are in a simulated world (one which is about to become distinct from the “real” world), based on your knowledge of the existence of such simulations. It could just as well be that you asked a friend to simulate 1000 copies of you in that moment and to teleport them to Hawaii as 11 AM strikes.
This problem is more interesting than I thought when I first read it (as Casebash). If you decide not to create the simulation, you are indifferent about having made that decision, since you know that you are the original and that you were always going to have this experience. However, if you do decide to create the simulation, then you are thankful that you did, as otherwise there is a good chance that the simulated you wouldn’t exist and be about to experience a beach.
Firstly, I’m not convinced that simulating a person necessarily results in consciousness, but that is largely irrelevant to this problem, as we can simply pretend that you are going to erase your memory 1000 times.
If you are going to simulate yourself 1000 times, then the chance, from your perspective, of being transported to Hawaii is 1000/1001. This calculation is correct, but it isn’t a paradox. Deciding to simulate yourself doesn’t change what will happen; there isn’t an objective probability that jumps from near 0 to 1000/1001. The near-0 figure comes from a model in which you have no tendency to simulate this moment, while the 1000/1001 figure comes from a model in which you are almost certain to simulate it. If an observer (with the same information you had at the start) could perfectly predict that you would decide to simulate, then they would report 1000/1001 odds both before and after the decision. If they instead had only 50% credence that you would make this decision, their odds beforehand would be about 0.5 × 1000/1001 ≈ 500/1001.
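To make that arithmetic concrete, here is a minimal sketch. The 1000-copy count and the predictor's 50% credence come from the setup above; the function names are mine, and the self-sampling assumption (you are equally likely to be any of the copies or the original) is how I'm reading the problem.

```python
# Probability of finding yourself in a simulation, given n_copies simulated
# copies created alongside the 1 original (uniform self-sampling over copies).
def p_simulated(n_copies: int) -> float:
    return n_copies / (n_copies + 1)

# An observer's credence before the decision: weight the "simulates" branch
# by their belief that you will in fact run the simulations; in the branch
# where you don't simulate, the probability of being simulated is 0.
def observer_credence(n_copies: int, p_decides: float) -> float:
    return p_decides * p_simulated(n_copies)

print(p_simulated(1000))             # 1000/1001 ≈ 0.999, after the decision
print(observer_credence(1000, 1.0))  # perfect predictor: ≈ 0.999 even before
print(observer_credence(1000, 0.5))  # 50% credence: ≈ 500/1001 ≈ 0.4995
```

Nothing "jumps": the perfect predictor's number never changes, and the 50% observer's number only moves because making the decision gives them new information.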
So, what is the paradox? If it is that you seem able to “warp” reality so that you are almost certainly about to teleport to Hawaii, my answer above explains this: if you are about to teleport, then it was always going to happen anyway. The simulation was already set up.
Or are you trying to make an anthropic argument? That if you make such a decision and then don’t appear in Hawaii, it is highly unlikely that you will be uploaded at some point? This is the Sleeping Beauty problem. I don’t 100% understand this yet.