if one started with no particular expectation of it being one bucket vs the other, i.e., assigned 1:1 odds, then after updating upon seeing a green marble, one ought to assign 9:1 odds, i.e., probability 9⁄10, right?
I don’t think so. I think the answer to both these problems is that if you update correctly, you get 0.5.
*blinks* mind expanding on that?
P(green|mostly green bucket) = 18⁄20
P(green|mostly red bucket) = 2⁄20
likelihood ratio = 9
if one started with no particular expectation of it being one bucket vs the other, i.e., assigned 1:1 odds, then after updating upon seeing a green marble, one ought to assign 9:1 odds, i.e., probability 9⁄10, right?
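A minimal sketch of that update, assuming the setup implied by the numbers above (18 of 20 marbles green in one bucket, 2 of 20 in the other, 1:1 prior); the variable names are mine:

```python
# Individual Bayes update for the 20-marble problem, assuming a 1:1 prior.
p_green_given_mostly_green = 18 / 20
p_green_given_mostly_red = 2 / 20
prior_mostly_green = 0.5

# Bayes' theorem: P(mostly green bucket | I drew green)
posterior = (p_green_given_mostly_green * prior_mostly_green) / (
    p_green_given_mostly_green * prior_mostly_green
    + p_green_given_mostly_red * (1 - prior_mostly_green)
)
print(posterior)  # 0.9, i.e. 9:1 odds
```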
I guess that does need a lot of explaining.
I would say:
P(green|mostly green bucket) = 1
P(green|mostly red bucket) = 1
P(green) = 1
because P(green) is not the probability that you will get a green marble, it’s the probability that someone will get a green marble. From the perspective of the priors, all the marbles are drawn, and no one draw is different from any other. If you don’t draw a green marble, you’re discarded, and the people who did draw green vote. For the purposes of figuring out the priors for a group strategy, your draw being green is not an event.
Of course, you know that you’ve drawn green. But the only thing you can translate it into that has a prior is “someone got green.”
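A minimal sketch of that group-level bookkeeping, assuming (as in the setup above) that both buckets contain at least one green marble, so “someone gets green” is certain either way; the variable names are mine:

```python
# Group-level update: the only event with a prior is "someone got green",
# and it has probability 1 under both hypotheses, so updating on it
# hands the prior back unchanged.
p_someone_green_given_mostly_green = 1.0
p_someone_green_given_mostly_red = 1.0
prior_mostly_green = 0.5

posterior = (p_someone_green_given_mostly_green * prior_mostly_green) / (
    p_someone_green_given_mostly_green * prior_mostly_green
    + p_someone_green_given_mostly_red * (1 - prior_mostly_green)
)
print(posterior)  # 0.5 -- equal to one of the starting numbers
```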
That probably sounds contrived. Maybe it is. But consider a slightly different example:
Two marbles and two people instead of twenty.
One marble is green, the other will be red or green based on a coin flip (green on heads, red on tails).
I like this example because it combines the two conflicting intuitions in the same problem. Only a fool would draw a red marble and remain uncertain about the coin flip. But someone who draws a green marble is in a situation similar to the twenty marble scenario.
If you were to plan ahead of time how the greens should vote, you would tell them to assume 50%. But a person holding a green marble might think it’s 2⁄3 in favor of double green.
To avoid embarrassing paradoxes, you can base everything on the four events “heads,” “tails,” “someone gets green,” and “someone gets red.” Update as normal.
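A quick enumeration of the two-marble example, as a sketch; the world-counting framing and names here are my own rendering of the setup described above:

```python
from fractions import Fraction

# Two-marble example: one marble is always green; the second is green on
# heads, red on tails. Two people each draw one of the two marbles.
# Enumerate the equally likely (coin result, marble I hold) worlds.
worlds = []
for coin in ("heads", "tails"):
    second = "green" if coin == "heads" else "red"
    for mine in ("green", second):
        worlds.append((coin, mine))

# Individual update: among worlds where *my* marble is green,
# how often is the coin heads (i.e. both marbles green)?
my_green = [w for w in worlds if w[1] == "green"]
p_heads_given_my_green = Fraction(
    sum(1 for w in my_green if w[0] == "heads"), len(my_green)
)
print(p_heads_given_my_green)  # 2/3

# Group/planning update: "someone gets green" holds in every world,
# so conditioning on it leaves the 1/2 prior on heads untouched.
p_heads_given_someone_green = Fraction(
    sum(1 for w in worlds if w[0] == "heads"), len(worlds)
)
print(p_heads_given_someone_green)  # 1/2
```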
yes, the probability that someone will get a green marble is rather different from the probability that I, personally, will get a green marble. But if I do personally get a green marble, that’s evidence in favor of the green bucket.
The decision algorithm for how to respond to that, though, is in this case skewed by the rules for the payout.
And in your example, if I drew green, I’d consider the 2⁄3 probability the correct one for whoever drew green.
Now, if there’s a payout scheme involved with funny business, that may alter some decisions, but not magically change my epistemology.
What kind of funny business?
Let’s just say that you don’t draw blue.
OK, but I think Psy-Kosh was talking about something to do with the payoffs. I’m just not sure if he means the voting or the dollar amounts or what.
Sorry for delay. And yeah, I meant stuff like “only greens get to decide, and the decision needs to be unanimous” and so on
I agree that changes the answer. I was assuming a scheme like that in my two marble example. In a more typical situation, I would also say 2⁄3.
To me, it’s not a drastic (or magical) change, just getting a different answer to a different question.
Um… okay… I’m not sure what we’re disagreeing about here, if anything:
my position is “given that I found myself with a green marble, it is right and proper for me to assign a 2⁄3 probability to both being green. However, the correct choice to make, given the peculiarities of this specific problem, may require one to make a decision that seems, on the surface, as if one didn’t update like that at all.”
Well, we might be saying the same thing but coming from different points of view about what it means. I’m not actually a Bayesian, so when I talk about assigning probabilities and updating them, I just mean doing equations.
What I’m saying here is that you should set up the equations in a way that reflects the group’s point of view because you’re telling the group what to do. That involves plugging some probabilities of one into Bayes’ Law and getting a final answer equal to one of the starting numbers.
So was I. But fortunately I was restrained enough to temper my uncouth humour with obscurity.