So if we can distinguish between “I know the probabilities involved and they are 50% for X and 50% for Y” and “I don’t know”, could we further distinguish between a uniform distribution on the 0 to 1 range and “I don’t know”?
Let’s say a biased coin with unknown probability p of landing heads is tossed, where p is uniform on (0,1), and “I don’t know” means you can’t predict better than random guessing. Saying “p is 50%” adds nothing, because it doesn’t beat a random guess.
But what if we toss the coin twice, and you have to guess both outcomes before the tosses? If at least one guess is correct, you keep your life. Assuming you want to live, how would you play? p is still uniform on (0,1), but it seems like “I don’t know” no longer means the same thing, because you can play in a way that better predicts the outcome of keeping your life.
You would guess (H,T) or (T,H) and avoid guessing randomly, because random guessing sometimes produces (H,H), which is dangerous: since p is uniform on (0,1), “heads with probability 90%” is exactly as likely as “heads with probability 10%”, and a 10%-heads coin is disastrous for (H,H), so bad that even the 90% case doesn’t compensate.
If p is 90% or 10%, guessing (H,T) or (T,H) gives the same small 9% probability of dying either way. But (H,H) gives at best a 1% chance of dying and at worst 81%. Saying “I don’t know” in this scenario doesn’t feel the same as “I don’t know” in the first scenario. I am probably confused.
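The intuition above can be checked with a quick Monte Carlo sketch of the game as described (the function name and trial count are my own; “dying” means both guesses are wrong):

```python
import random

def survival_rate(guess, trials=200_000):
    """Estimate P(at least one guess correct) when the same biased
    coin, with bias p drawn uniformly from (0,1), is tossed twice."""
    survived = 0
    for _ in range(trials):
        p = random.random()  # unknown bias, uniform prior
        tosses = ['H' if random.random() < p else 'T' for _ in range(2)]
        if guess[0] == tosses[0] or guess[1] == tosses[1]:
            survived += 1
    return survived / trials

# A mixed guess survives noticeably more often than a matched one:
print(survival_rate(('H', 'T')))   # ≈ 5/6 ≈ 0.833
print(survival_rate(('H', 'H')))   # ≈ 2/3 ≈ 0.667
```

The gap comes from exactly the asymmetry described: (H,T) dies with probability p(1−p), which is small whether p is near 0 or near 1, while (H,H) dies with probability (1−p)², which is catastrophic when p is small.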
but it seems like “I don’t know” doesn’t mean the same thing anymore, because you can play in a way that can better predict the outcome of keeping your life.
But you’ve changed things :-) In your situation you know a very important thing: that the probability p is the same for both throws. That is useful information which allows you to do some probability math (specifically compare 1 - p(1-p) and 1 - p^2).
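A tiny sketch making that comparison concrete: average each strategy’s survival probability over the uniform prior on p, which only needs E[p] = 1/2 and E[p²] = 1/3 (variable names are my own):

```python
from fractions import Fraction

# Survival probabilities from the comment above, as functions of p:
#   mixed guess (H,T):          1 - p*(1-p)   (die only on outcome (T,H))
#   matched guess, e.g. (T,T):  1 - p**2      (die only on outcome (H,H))
# Moments of p ~ Uniform(0,1):
E_p = Fraction(1, 2)    # E[p]
E_p2 = Fraction(1, 3)   # E[p^2]

mixed = 1 - (E_p - E_p2)   # E[1 - p(1-p)] = 5/6
matched = 1 - E_p2         # E[1 - p^2]    = 2/3

print(mixed, matched)      # 5/6 2/3
```

So knowing the two tosses share the same p is worth a 5/6 versus 2/3 survival chance.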
But let’s say you don’t toss the same coin twice, but you toss two different coins. Does guessing (H,T) help now?
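One way to check (a Monte Carlo sketch, not part of the original exchange): if the two coins’ biases are drawn independently from Uniform(0,1), each toss comes up heads with overall probability 1/2, so every guess should survive with probability 1 − 1/2 · 1/2 = 3/4 and the mixed strategy loses its edge.

```python
import random

def survival_rate_two_coins(guess, trials=200_000):
    """Two different coins, each with its own bias drawn
    independently and uniformly from (0,1)."""
    survived = 0
    for _ in range(trials):
        p1, p2 = random.random(), random.random()  # independent biases
        t1 = 'H' if random.random() < p1 else 'T'
        t2 = 'H' if random.random() < p2 else 'T'
        if guess[0] == t1 or guess[1] == t2:
            survived += 1
    return survived / trials

# With independent coins, every guess does equally well:
print(survival_rate_two_coins(('H', 'T')))  # ≈ 3/4
print(survival_rate_two_coins(('H', 'H')))  # ≈ 3/4
```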
I understand now. Thanks!