Yes, the probability that someone will get a green marble is rather different from the probability that I, personally, will get a green marble. But if I do personally get a green marble, that’s evidence in favor of the green bucket.
The decision algorithm for how to respond to that, though, is in this case skewed by the rules for the payout.
And in your example, if I drew green, I’d consider the 2⁄3 probability the correct one for whoever drew green.
Now, if there’s a payout scheme involved with funny business, that may alter some decisions, but not magically change my epistemology.
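The 2⁄3 figure can be checked mechanically. The exact setup isn’t spelled out in this thread, so the following is a minimal sketch under an assumed version of the two-marble example: two equally likely buckets, where the “green” bucket holds two green marbles and the “mixed” bucket holds one green and one blue, and I draw one marble and see green.

```python
from fractions import Fraction

# Assumed (hypothetical) setup, not stated explicitly in the thread:
# two equally likely buckets; the green bucket has two green marbles,
# the mixed bucket has one green and one blue.
prior_green_bucket = Fraction(1, 2)
prior_mixed_bucket = Fraction(1, 2)

p_draw_green_given_green_bucket = Fraction(2, 2)  # both marbles green
p_draw_green_given_mixed_bucket = Fraction(1, 2)  # one of two is green

# Bayes' Law: P(green bucket | I drew green)
evidence = (prior_green_bucket * p_draw_green_given_green_bucket
            + prior_mixed_bucket * p_draw_green_given_mixed_bucket)
posterior = prior_green_bucket * p_draw_green_given_green_bucket / evidence
print(posterior)  # 2/3
```

Under these assumptions the posterior comes out to 2⁄3, matching the figure discussed above.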
What kind of funny business?
Let’s just say that you don’t draw blue.
OK, but I think Psy-Kosh was talking about something to do with the payoffs. I’m just not sure if he means the voting or the dollar amounts or what.
Sorry for delay. And yeah, I meant stuff like “only greens get to decide, and the decision needs to be unanimous” and so on
I agree that changes the answer. I was assuming a scheme like that in my two marble example. In a more typical situation, I would also say 2⁄3.
To me, it’s not a drastic (or magical) change, just getting a different answer to a different question.
Um… okay… I’m not sure what we’re disagreeing about here, if anything:
my position is “given that I found myself with a green marble, it is right and proper for me to assign a 2⁄3 probability to both being green. However, the correct choice to make, given the peculiarities of this specific problem, may require one to make a decision that seems, on the surface, as if one didn’t update like that at all.”
Well, we might be saying the same thing but coming from different points of view about what it means. I’m not actually a bayesian, so when I talk about assigning probabilities and updating them, I just mean doing equations.
What I’m saying here is that you should set up the equations in a way that reflects the group’s point of view because you’re telling the group what to do. That involves plugging some probabilities of one into Bayes’ Law and getting a final answer equal to one of the starting numbers.
So was I. But fortunately I was restrained enough to temper my uncouth humour with obscurity.