I agree with Thomas: even if this proved that thirdism is right when you are planning to do this, it would not prove that it is right when you are not planning to do this. In fact it suggests the opposite: since the thirder answer requires the update, thirdism is false without the update.
The following principle seems plausible to me: creating any weird situation X outside the experiment shouldn’t affect my beliefs, if I can verify that I’m in the experiment and not in situation X. Disagreeing with that principle seems like a big bullet to bite, but maybe that’s just because I haven’t found any X that would lead to anything except thirdism (and I’ve tried). It’s certainly fair to scrutinize the idea because it’s new, and I’d love to learn about any strange implications.
“The next morning, I wake up not knowing whether I’m still in the experiment or not.”
By creating a situation outside the experiment that is initially indistinguishable from being in the experiment, you affect how the experiment should be evaluated. The same is true, for example, if the whole experiment is done multiple times rather than only once.
Yeah, if the whole experiment is done twice, and you’re truthfully told “this is the first experiment” or “this is the second experiment” at the beginning of each day (a minute after waking up), then I think your reasoning in the first experiment (an hour after waking up) should be the same as though the second experiment didn’t exist. Having had a minute of confusion in your past should be irrelevant.
I disagree. I have presented arguments on LW in the past that if the experiment is run once in the history of the universe, you should reason as a halfer, but if the experiment is run many times, you will assign a probability in between 1⁄2 and 1⁄3, approaching one third as the number of times approaches infinity. I think that this applies even if you know the numerical identity of your particular run.
Interesting! I was away from LW for a long time and probably missed it. Can you give a link, or sketch the argument here?
Actually, I was probably mistaken. I think I was thinking of this post and in particular this thread and this one. (I was previously using the username “Unknowns”.)
I think I confused this with Sleeping Beauty because of the similarity of Incubator situations with Sleeping Beauty. I’ll have to think about it but I suspect there will be similar results.
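For concreteness, here is one way to get the “between 1⁄2 and 1⁄3” behaviour numerically (a minimal sketch that assumes an SSA-style reference class of all awakenings across all N runs; this may not be exactly the argument from those old threads):

```python
# Minimal sketch: N independent runs of the experiment, each with a fair coin.
# Heads -> 1 awakening, tails -> 2 awakenings. Treat yourself as a randomly
# sampled awakening from all awakenings produced across the N runs (an
# SSA-style reference class) and ask: what is the probability that the coin
# in *your* run landed heads?

from math import comb

def p_heads(n_runs: int) -> float:
    """P(the run containing a randomly sampled awakening had heads)."""
    total = 0.0
    for k in range(n_runs + 1):            # k = number of heads among the n_runs flips
        p_config = comb(n_runs, k) * 0.5 ** n_runs
        awakenings = k + 2 * (n_runs - k)  # heads-runs: 1 awakening, tails-runs: 2
        total += p_config * (k / awakenings)
    return total

for n in (1, 2, 5, 10, 100, 1000):
    print(n, round(p_heads(n), 4))
```

For a single run this gives exactly 1⁄2, and the value falls toward 1⁄3 as the number of runs grows.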