This post is very much in accordance with my experience. I’ve never been able to develop any non-frequentist intuitions about probability, and even simple problems sometimes confuse me until I translate them into explicit frequentist terms. However, once I have the reference classes clearly defined and sketched, I have no difficulty following complex arguments and solving reasonably hard problems in probability. (This includes the numerous supposed paradoxes that disappear as soon as the problem is stated in clear frequentist terms.)
Moreover, I’m still at a loss to understand what meaning the numerical values of probabilities could have except for the frequentist ratios that they imply. I raised the question in a recent discussion here, but I didn’t get any satisfactory answers.
You watch someone flip a coin a hundred times. After a while, you get your frequentist sense of the probability that it will come up heads.
Then somebody takes a small, flat, square piece of metal and writes “heads” on one side. Before flipping it, he asks you: “What’s the chance it’s going to come up ‘heads’ 100 times in a row?”
Would you say, “I have no idea”?
If you said, “Well, very unlikely, obviously”, what makes it so obvious to you? What’s your degree of certainty about each statement in your line of reasoning? And where did those degrees of certainty come from?
Sure, all sorts of past “reference classes” and analogous events might turn up as we discussed your reasoning. But the fact would still remain: most people, whether asked about that coin or about that small flat square piece of metal, will give you a pretty inaccurate answer if you ask them how likely it is to come up heads five times in a row, no matter whether you ask in frequentist or Bayesian terms.
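For concreteness, the five-heads question has an exact answer of (1/2)^5 = 1/32 ≈ 0.031, and the frequentist reading of that number is directly checkable by counting. Here's a minimal sketch (the function name and trial count are my own, not anything from the discussion above) that estimates the probability by simulation and lets you compare it against the exact value:

```python
import random

def run_of_heads(k: int, trials: int = 200_000, seed: int = 0) -> float:
    """Estimate P(k fair-coin flips all come up heads) by brute counting:
    run many trials, count the fraction where every flip is heads."""
    rng = random.Random(seed)
    hits = sum(
        all(rng.random() < 0.5 for _ in range(k))  # one trial: k flips, all heads?
        for _ in range(trials)
    )
    return hits / trials

exact = 0.5 ** 5          # 1/32 = 0.03125
estimate = run_of_heads(5)
print(exact, estimate)    # the estimate should hover near 0.03125
```

Most people's gut answers to "five heads in a row" land noticeably far from 1/32, which is the point: the reference class of *runs* isn't one we naturally keep tallies over.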
When it comes to assessing the chance of a certain series of independent events, bias of some kind does seem to enter. This is probably (um, heh) because, although we might be fairly frequentist when it comes to notable and frequent events, we don’t naturally note any given series of independent events as an event in itself. (For one thing, the sheer combinatorics prevent us from even encountering many such series.)
I wouldn’t be surprised if the ultimate synthesis shows our brains are effectively frequentist (even almost optimally so) when it comes to the sorts of situations they evolved under, but also that these evolved optimizations break down under conditions found in our increasingly artificial world. One does not find things much like coins in nature, nor much reason for people to use their computed fairness to resolve issues.
If you said, “Well, very unlikely, obviously”, what makes it so obvious to you? What’s your degree of certainty about each statement in your line of reasoning? And where did those degrees of certainty come from?
Haha, a similar argument could justify numerical “degrees of tastiness” for sandwiches.
Are you denying degrees of tastiness?!?
I’m not trying to justify numerical degrees of anything in the human brain. We’re talking models here, not physical reality. If you can’t see that, Cousin Itt, well, you can go squeak somewhere else in the Addams Family mansion.