Can someone come up with a situation of the same general form as this one where anthropic reasoning results in optimal actions and nonanthropic reasoning results in suboptimal actions?
How about if the wager is that anybody in any room can guess the outcome of the coinflip, winning $1 if they get it right and losing $2 if they get it wrong?
If you still think it's 50% after waking up in a green room, you won't take the bet and you'll win $0. If you think it's 90%, you'll take the bet, and the group comes out $14 ahead on balance: 18 of you win $1 each while two of you lose $2 each. A quick sketch of the arithmetic is below.
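Here's a minimal sketch of those numbers, assuming the OP's setup (20 people; heads puts 18 of you in green rooms and 2 in red, tails the reverse, so room color and flip outcome are symmetric):

```python
# Sketch of the wager under the assumed 20-person setup:
# heads -> 18 green rooms / 2 red; tails -> 2 green / 18 red.
# A correct guess wins $1, an incorrect one loses $2.

WIN, LOSS = 1, -2

def personal_ev(p_heads):
    """Expected value of betting 'heads' given your credence in heads."""
    return p_heads * WIN + (1 - p_heads) * LOSS

def group_payoff(take_bet):
    """Total group payoff if everyone bets the outcome their room color
    favors (or everyone declines). By symmetry the total is the same
    whichever way the coin lands, so we compute it for heads:
    18 green-room occupants guess right, 2 red-room occupants guess wrong."""
    if not take_bet:
        return 0
    return 18 * WIN + 2 * LOSS

print(personal_ev(0.5))      # -0.5: a 50% reasoner declines the bet
print(personal_ev(0.9))      # +0.7: a 90% reasoner accepts it
print(group_payoff(True))    # 14: the group comes out $14 ahead
print(group_payoff(False))   # 0: everyone declining wins nothing
```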
Doesn’t this show anthropic reasoning is right as much as the OP shows it’s wrong?