I don’t think I need to think about reference classes at all. I can just notice that I’m in a state of uncertainty between two outcomes, and since there is no reason to think that either one is more likely than the other, I use the equiprobable prior.
I believe the ridiculousness of anthropics appears when the model assumes that I’m randomly selected from a distribution when in reality that isn’t the case. But sometimes it is the case. So there are situations where self-locating probability is valid and situations where it isn’t.
I think my intuition pump is this:
If I’m split into ten people, 9 of whom are going to wake up in red rooms while 1 is going to wake up in a blue room, it’s correct to have 9:1 odds in favour of red for my expected experience, because I would actually be one of these 10 people.
But if a fair coin is tossed, and I’m split into 9 people who will all wake up in red rooms if it’s heads, or I wake up alone in a blue room if it’s tails, then the odds are 1:1, because the causal process is completely different. I am either one of nine people or one of one based on the result of the coin toss, not on an equiprobable draw over the people.
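For concreteness, here’s a minimal Monte Carlo sketch of the two set-ups (function names and trial count are mine, just for illustration). It follows the causal process as described above, so the second scenario tracks the coin rather than a uniform draw over the resulting people:

```python
import random

TRIALS = 100_000

def split_first(trials=TRIALS):
    # Split into 10 people up front: 9 wake in red rooms, 1 in blue.
    # "I" am equally likely to end up as any of the 10 people.
    red = sum(random.randrange(10) < 9 for _ in range(trials))
    return red / trials  # ~0.9, i.e. 9:1 odds in favour of red

def coin_first(trials=TRIALS):
    # Toss a fair coin first: heads -> 9 copies, all in red rooms;
    # tails -> a single copy in a blue room. My room colour is fixed
    # by the coin, not by a draw over the resulting copies.
    red = sum(random.random() < 0.5 for _ in range(trials))
    return red / trials  # ~0.5, i.e. 1:1 odds

print(split_first(), coin_first())
```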
Also, neither of these cases involves “updating on existence/waking up”. I was expected to exist anyway, so I got no new information.
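One way to write out the “no new information” point for the coin case: waking up was guaranteed under both outcomes, so the likelihoods are equal and the posterior is just the prior:

$$P(\text{heads}\mid\text{awake}) = \frac{P(\text{awake}\mid\text{heads})\,P(\text{heads})}{P(\text{awake})} = \frac{1\cdot 0.5}{1} = 0.5$$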