Rather than “I am a person,” let’s substitute “I am painted green.”
Suppose we start out with ten people, none of them painted green.
A coin is flipped. If heads, one person is painted green. If tails, nine people are painted green.
If you observe that you have been painted green, what is your probability that the coin landed heads? Bayes’ rule time!
P(green) = 0.5 * 0.1 + 0.5 * 0.9 = 0.5, so P(tails | green) = P(tails) * P(green | tails) / P(green) = 0.5 * 0.9 / 0.5 = 0.9. Observing that you have been painted green, you conclude that the coin is more likely to have landed tails. Simple Bayesian updating.
In this simple problem, upon learning that you have been painted green, you give equal weight to each green person within each outcome, with the outcomes themselves weighted by the prior probability of the coin.
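The update above is easy to check numerically. Here is a minimal Monte Carlo sketch (function name and trial count are my own choices, not from the discussion): you are one fixed person among the ten, and we condition on the trials in which you end up green.

```python
import random

def green_paint_simulation(trials=200_000, seed=1):
    """Estimate P(tails | you are painted green).

    Each trial: flip a fair coin; tails paints 9 of the 10 people
    green, heads paints 1. You are a fixed person, so your chance
    of being green is n_green / 10 in either case.
    """
    rng = random.Random(seed)
    green_total = 0       # trials where you are painted green
    green_and_tails = 0   # of those, trials where the coin was tails
    for _ in range(trials):
        tails = rng.random() < 0.5
        n_green = 9 if tails else 1
        you_green = rng.random() < n_green / 10
        if you_green:
            green_total += 1
            green_and_tails += tails
    return green_and_tails / green_total

print(green_paint_simulation())  # close to 0.9
```

The simulated frequency matches the Bayesian answer: among the worlds in which you find yourself green, roughly nine in ten are tails-worlds.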
Could you elaborate on the implications of that statement? I’m not following what you’re trying to say.