I also get “stop after two losses,” although my numbers come out slightly differently. However, I suck at this sort of problem, so it’s quite likely I’ve got it wrong.
My temptation would be to solve it numerically (by brute force), i.e. code up a simulation and run it a million times and get the answer by seeing which strategy does best. Often that’s the right approach. However, sometimes you can’t simulate, and an analytical (exact, a priori) answer is better.
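Something along these lines, say. This is only a sketch of the brute-force idea: the box parameters below (half the boxes pay $2 per $1 coin with probability 0.9, half never pay, a bankroll of 100 coins, and "losses" read as consecutive losses) are placeholder assumptions, not the exact numbers from the problem.

```python
import random

GOOD_PROB = 0.5      # assumed chance the box is the paying kind
PAYOUT_PROB = 0.9    # assumed per-coin payout chance for a good box
COINS = 100          # assumed bankroll of $1 coins
TRIALS = 100_000     # bump to 1_000_000 for a smoother estimate

def play(stop_after):
    """Net winnings for the strategy 'quit after `stop_after` consecutive losses'."""
    good = random.random() < GOOD_PROB
    net, losses = 0, 0
    for _ in range(COINS):
        net -= 1                                  # insert a coin
        if good and random.random() < PAYOUT_PROB:
            net += 2                              # box pays out $2
            losses = 0
        else:
            losses += 1
            if losses >= stop_after:
                break                             # walk away
    return net

for k in range(1, 6):
    avg = sum(play(k) for _ in range(TRIALS)) / TRIALS
    print(f"stop after {k} consecutive losses: average net {avg:+.2f}")
```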
I think you are right about the sportsball case! I’ve updated my meta-meta-probability curve accordingly :-)
Can you think of a better example, in which the curve ought to be dead flat?
Jaynes uses “the probability that there was once life on Mars” in his discussion of this. I’m not sure that’s such a great example either.
The Wikipedia article on the Beta distribution has a good discussion of possible priors to use. The Jeffreys prior is probably the one I’d use for Sportsball, but the Bayes-Laplace prior is generally acceptable as a representation of ignorance.
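To make those concrete: a Beta(a, b) prior plus k successes in n trials gives a Beta(a + k, b + n − k) posterior, so the priors differ only in the pseudo-counts they start with. A quick sketch (the 3-wins-in-4-games record is made-up data, purely for illustration):

```python
# Conjugate updating with the priors mentioned above:
# Beta(a, b) prior + k successes in n trials -> Beta(a + k, b + n - k) posterior.
priors = {
    "Bayes-Laplace (uniform)": (1.0, 1.0),
    "Jeffreys":                (0.5, 0.5),
    "Haldane (improper)":      (0.0, 0.0),
}
k, n = 3, 4   # hypothetical Sportsball record: 3 wins in 4 games

for name, (a, b) in priors.items():
    post_a, post_b = a + k, b + (n - k)
    mean = post_a / (post_a + post_b)
    print(f"{name:24s} posterior Beta({post_a:g}, {post_b:g}), mean {mean:.3f}")
```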
The example I like to give is the uncertain digital coin: I generate some double p between 0 and 1 using a random number generator, and then write a function “flip” which generates another double and compares it to p. This is analogous to your blue box, and if you’re confident in the RNG, you have a tight meta-meta-probability curve, which justifies the uniform prior.
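Roughly this, as a sketch (the names are mine):

```python
import random

p = random.random()   # hidden bias, drawn uniformly from [0, 1)

def flip():
    """One flip of the uncertain digital coin: True ('heads') with probability p."""
    return random.random() < p
```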
Yeah, the life-on-Mars example seems like a good candidate for the Haldane prior to me.
Glad you liked it!