I’ve always thought of SSA and SIA as assumptions that depend on what your goal is in trying to figure out the probability. Sleeping Beauty may want to maximize the probability that she guesses the coin correctly at least once, in which case she should use the probability 1⁄2. Or she may want to maximize the number of correct guesses, in which case she should use the probability 2⁄3.
In either case, asking “but what’s the probability, really?” isn’t helpful.
Edit: in the second situation, Sleeping Beauty should use the probability 2⁄3 to figure out how to maximize the number of correct guesses. This doesn’t mean she should guess T 2⁄3 of the time—her answer also depends on the payouts, and in the simplest case (she gets $1 for every correct guess) she should be guessing T 100% of the time.
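To make the edit concrete, here is a minimal simulation sketch (assuming the $1-per-correct-guess payoff from the edit above; the trial count is arbitrary):

```python
import random

def expected_winnings(guess, n_trials=100_000):
    """Average winnings per experiment when Beauty gets $1 for every
    correct guess and guesses the same way at every awakening."""
    total = 0
    for _ in range(n_trials):
        coin = random.choice(["H", "T"])
        awakenings = 2 if coin == "T" else 1  # tails: woken Monday and Tuesday
        if guess == coin:
            total += awakenings  # one $1 guess per awakening
    return total / n_trials

print(expected_winnings("T"))  # ~1.0: tails pays $2 half the time
print(expected_winnings("H"))  # ~0.5: heads pays $1 half the time
```

Guessing T every time nets about $1 per experiment versus about $0.50 for H, so under this payoff she should always guess T even though the coin is fair.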
In either case, asking “but what’s the probability, really?” isn’t helpful.
Strongly agree. My paper here (http://arxiv.org/abs/1110.6437) takes the problem apart, considers the different components that go into a decision (utilities, probabilities, altruism towards other copies), and shows you can reach the correct decision without worrying about the probabilities at all.
You’re wondering whether or not to donate to reduce existential risks. You won’t donate if you’re almost certain the world will end soon either way. You wake up as the 100 billionth person. Do you use this information to update on the probability that there will only be on the order of 100 billion people, and refrain from donating?
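For illustration, here is a hedged sketch of that update under SSA, with made-up numbers (the 50/50 prior and the two population totals are hypothetical):

```python
# Doomsday-style SSA update: treat yourself as a uniform random draw
# from everyone who ever lives, so P(rank r | N people total) = 1/N.
prior_soon, prior_late = 0.5, 0.5   # hypothetical prior over the two futures
N_soon, N_late = 2e11, 2e14         # total people ever: "ends soon" vs "ends late"
rank = 1e11                         # you observe you are the 100 billionth person

like_soon = 1 / N_soon              # the rank is possible under both hypotheses
like_late = 1 / N_late

posterior_soon = (prior_soon * like_soon) / (
    prior_soon * like_soon + prior_late * like_late
)
print(posterior_soon)  # ~0.999: the SSA update strongly favors "ends soon"
```

Whether you accept that update (and therefore refrain from donating) is exactly what’s at stake between the two assumptions: SIA’s weighting by population size cancels the 1/N factor, leaving the prior unchanged.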
I really like your explanations in this thread.
However, I’ve always had the feeling that people raising “it just depends on the utility function / bet / payoff” were mostly trying to salve egos wounded by having wrongly analyzed the problem. It’s instructive to consider utility, but don’t pretend to be confused about whether Beauty should be surprised to learn that the toss was H and not T.
You’re right. For that reason, I think my explanations in the follow-up comments were better than this first attempt (not that this post is incorrect, it just doesn’t quite address the main point). I’ve previously tried to say the same thing here and here. My opinion hasn’t changed, but maybe my way of expressing it has.
Probabilities are unique. They’re a branch of math. They depend on your information, but your motivations are usually “extraneous junk information.” And math still works the same even if you ask what it is, really (What’s 2+2, really? 4).
Now, you could invent something else for the letters “probability” to mean, and define that to be 1⁄2 in the Sleeping Beauty problem; that’s fine. But that wouldn’t be some “other probability.” That would be some other “probability.”
EDIT: It appears that I thoroughly misunderstood Misha to be saying two wrong things—first that probability can be defined by maximizing different things depending on what you want (not what was said), and second that asking “but what’s the probability really?” isn’t helpful because I’m totally wrong about probabilities being unique. So, whoops.
What I’m saying is that there are two probabilities there, and they are both the correct probabilities, but they are the correct probabilities of different things. These different things seem like answers to the same question because the English language isn’t meant to deal with Sleeping Beauty type problems. But there is a difference, which I’ve done my best to explain.
Given that, is there anything your nitpicking actually addresses?
By “two probabilities” you mean this?
Sleeping Beauty may want to maximize the probability that she guesses the coin correctly at least once, in which case she should use the probability 1⁄2. Or she may want to maximize the number of correct guesses, in which case she should use the probability 2⁄3.
That looks like two “probabilities” to me. Could you explain what the probabilities would be of, using the usual Bayesian understanding of “probability”?
I can try to rephrase what I said, but I honestly have no clue what you mean by putting probabilities in quotes.
2⁄3 is the probability that this Sleeping Beauty is waking up in a world where the coin came up tails. 1⁄2 is the probability that some Sleeping Beauty will wake up in such a world. To the naive reader, both of these things sound like “The probability that the coin comes up tails”.
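Worked out with explicit weights (a sketch; the per-branch weight of 1⁄2 just restates the fair coin):

```python
from fractions import Fraction

half = Fraction(1, 2)

# "Some Sleeping Beauty wakes up in a tails-world": true in every tails
# branch, and the tails branch has prior weight 1/2.
print(half)  # 1/2

# "This Sleeping Beauty is waking up in a tails-world": weight each
# awakening by its branch probability. Tails yields two awakenings, heads one.
awakenings = [("H", half), ("T", half), ("T", half)]
tails_weight = sum(w for c, w in awakenings if c == "T")
total_weight = sum(w for _, w in awakenings)
print(tails_weight / total_weight)  # 2/3
```

Both numbers come from the same fair coin; they differ only in which event is being measured.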
Ah, okay, that makes sense to me now. Thanks.
I put the word “probability” in quotes because I wanted to talk about the word itself, not the type of logic it refers to. The reason I thought you were talking about different types of logic using the same word was that probability already specifies what you’re supposed to be maximizing. For individual probabilities it could be one of many scoring rules, but if you want to add scores together you need to use the log scoring rule.
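A quick sketch of that last point: the log score is the one that adds across events, because the score of a joint forecast of independent events equals the sum of the per-event scores:

```python
import math

def log_score(p, outcome):
    """Logarithmic score for one binary forecast: log of the probability
    assigned to what actually happened."""
    return math.log(p if outcome else 1 - p)

p, q = 0.7, 0.4  # arbitrary forecasts for two independent events, both of which occur
joint = math.log(p * q)                            # score the conjunction directly
summed = log_score(p, True) + log_score(q, True)   # score each event, then add
print(math.isclose(joint, summed))  # True, since log(p*q) = log p + log q
```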
To the naive reader, both of these things sound like “The probability that the coin comes up tails”.
Right. One of them is the probability that the coin comes up tails given some starting information (as in a conditional probability, like P(T | S)), and the other is the probability that the coin comes up tails, given the starting information and some anthropic information: P(T | S A). So they’re both “P(T),” in a way.
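The same split in Bayes form (a sketch; treating relative awakening counts as the likelihood of the anthropic datum A is itself the thirder modeling choice, not a given):

```python
from fractions import Fraction

p_T_given_S = Fraction(1, 2)   # P(T | S): the fair coin, before any awakening

# P(T | S A): condition on "I am at an awakening." Tails produces two
# awakenings, heads one, so an awakening is twice as likely under tails.
like_A_given_T = Fraction(2)
like_A_given_H = Fraction(1)

p_T_given_SA = (p_T_given_S * like_A_given_T) / (
    p_T_given_S * like_A_given_T + (1 - p_T_given_S) * like_A_given_H
)
print(p_T_given_S, p_T_given_SA)  # 1/2 2/3
```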
Hah, so I think in your original comment you meant “asking ‘but what’s P(T), really?’ isn’t helpful,” but I heard “asking ‘but what’s P(T | S A), really?’ isn’t helpful” (in my defense, some people have actually said this).
If this is right, I’ll edit it into my original reply so that people can be less confused. Lastly, in light of this, there is only one thing I can link to.