I would say it’s extremely unclear to me that the question “what is your probability that you are agent X” in an anthropic question like this is meaningful and has a well-defined answer. You said “there are practical reasons you’d like to know”, but you haven’t actually concretely specified what will be done with the information.
In the process of looking for something I had previously read about this, I found the following post:
https://www.lesswrong.com/posts/y7jZ9BLEeuNTzgAE5/the-anthropic-trilemma
Which seems to be asking a very similar question to the one you’re considering. (It mentions Ebborians, but postdates that post significantly.)
I then found the thing I was actually looking for: https://www.lesswrong.com/tag/sleeping-beauty-paradox
Which demonstrates why “what rent the belief is paying” is critical:
“If Beauty’s bets about the coin get paid out once per experiment, she will do best by acting as if the probability is one half. If the bets get paid out once per awakening, acting as if the probability is one third has the best expected value.”
Which says, to me, that the probability is not uniquely defined—in the sense that a probability is really a claim about what sort of bets you would take, but in this case the way the bet is structured around different individuals/worlds is what controls the apparent “probability” you should choose to bet with.
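A quick sketch of the arithmetic behind that betting claim, assuming the standard Sleeping Beauty setup (fair coin; heads produces one awakening, tails produces two). The two “probabilities” simply come from scoring over different units: experiments versus awakenings.

```python
from fractions import Fraction

# Each coin outcome: (probability of that outcome, awakenings it produces)
branches = {"heads": (Fraction(1, 2), 1), "tails": (Fraction(1, 2), 2)}

# Bets paid once per experiment: each experiment contributes one payout,
# so the relevant probability is just the coin's bias.
p_heads_per_experiment = branches["heads"][0]

# Bets paid once per awakening: weight each outcome by how many payouts
# (awakenings) it generates, then renormalize.
weights = {k: p * n for k, (p, n) in branches.items()}
p_heads_per_awakening = weights["heads"] / sum(weights.values())

print(p_heads_per_experiment)  # 1/2
print(p_heads_per_awakening)   # 1/3
```

Same coin, same setup; only the payout structure differs, which is exactly the sense in which the “probability” isn’t uniquely pinned down by the physical situation alone.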
Ahh. I’m familiar with that case. Did having that in mind make you feel like there’s too much ambiguity in the question to really want to dig into it? I wasn’t considering that sort of scenario (“they need to know their position” rules it out), but I can see why it would have come to mind.
You might find this removed part relevant: https://www.lesswrong.com/posts/gx6GEnpLkTXn3NFSS/we-need-a-theory-of-anthropic-measure-binding?commentId=mwuquFJHNCiYZFwzg
It acknowledges that some variants of the question can have that quality of... not really needing to know their position.
I’m going to have to think about editing that stuff back in.