I had considered making a post about the Sleeping Beauty Problem, but I’ll just latch on to yours. I thought a modification of the original problem would be more interesting: as before, Sleeping Beauty is put to sleep, and if woken up and questioned, is put back to sleep with amnesia. As before, a coinflip determines whether she may be woken once or twice. But the new bit is that instead of being guaranteed to wake up, a die is rolled on each day she might be woken, determining whether she actually is. This time she really can use the fact of being awake as evidence about the coinflip, and the strength of that evidence depends on the waking probability set by the die. E.g. if she knows she has a 1⁄3 chance of being woken on each eligible day, then with two days the chance of at least one waking is 1 − (2⁄3)² = 5⁄9, so conditioning on having been woken at least once gives odds of 5⁄9 : 1⁄3, i.e. 5:3 in favor of two days, a posterior of 5⁄8.
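A quick Monte Carlo check of that arithmetic (a minimal sketch of the setup as I described it: fair coin, independent 1⁄3 waking chance per eligible day, conditioning on at least one waking; the function names are just for illustration):

```python
import random

def trial(p_wake=1/3):
    """One run of the modified experiment."""
    two_days = random.random() < 0.5  # coinflip: True -> two potential waking days
    days = 2 if two_days else 1
    # On each eligible day, an independent "die roll" decides whether she wakes.
    woken = any(random.random() < p_wake for _ in range(days))
    return two_days, woken

def posterior_two_days(n=1_000_000):
    """Estimate P(two days | woken at least once)."""
    awake = two_and_awake = 0
    for _ in range(n):
        two_days, woken = trial()
        if woken:
            awake += 1
            two_and_awake += two_days
    return two_and_awake / awake

print(posterior_two_days())  # ~0.625, matching the exact 5/8 from the 5:3 odds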
In anthropic terms, this means our existence is evidence that the world is one in which our existence is likely. If your reward structure were such that your reward is multiplied by the number of observers who agree with you (as with multiple copies of yourself), you’d be best off overestimating the number of observers, but I don’t see how that would be useful in practice.
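To see why overestimating wins under that reward structure, here is a toy sketch (the two worlds, their equal priors, and the observer counts are illustrative assumptions of mine, not part of the original setup):

```python
def expected_total_reward(guess, worlds):
    """worlds: list of (prior, n_observers, label). Every copy guesses alike,
    so a correct guess pays 1 per agreeing observer."""
    return sum(prior * n for prior, n, label in worlds if label == guess)

# Equal priors, unequal observer counts.
worlds = [(0.5, 1, "few"), (0.5, 10, "many")]
print(expected_total_reward("few", worlds))   # 0.5
print(expected_total_reward("many", worlds))  # 5.0 -> betting on "many" dominates
```

Even at even prior odds, the observer-multiplied payoff pushes you toward the world with more observers.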