I think you’re coming at this question from a very different perspective than I do, but for me, it isn’t about the mathematical result, which, as you and others noted, is very simple and entirely uncontroversial. I also expect it to be uncontroversial to say that, for most people, most of the time, it is not useful to try to explicitly calculate conditional probabilities when making predictions and decisions. Instead, what matters to me is that some of the very basic, mathematically obvious, and uncontroversial implications of conditional probability calculations have no bearing at all on how most people in our world make decisions, not just personal decisions but ones with enormous consequences for our society and our collective futures.
In my experience, most people, when trying to reason logically, think by default in binary categories and have no idea how to deal with uncertainty. I work in a job where my goal is to advise others on various decisions based on my interpretation of often ambiguous technical, industry, and other data, and to help train others to do the same. I have never once made an explicit calculation using Bayes’ theorem, but I consistently observe that a surprisingly (to me) large fraction of people start out extremely resistant to making reasonable assumptions based on prior knowledge and seeing where they lead, and instead insist that if they don’t have “certainty” they can’t say anything useful at all. Those people much more often fail to make good predictions, miss out on large opportunities, and waste exorbitant amounts of time and effort chasing down details that do not meaningfully affect the final output of their prediction- and decision-making processes. They’re also more likely to miss huge, gaping holes in their models of the world that render all their careful attempts at certainty meaningless.
In other words, for me the point is to develop a better intuitive understanding of how to make reasoned decisions that are likely to lead to desired outcomes—aka to become less wrong, and wrong less often. It’s partly because Bayes’ theorem is so simple that, for a certain type of person, it serves as a useful entry point on the path to that goal.
This strikes me as both a good explanation of why people are excited by Bayes’ theorem and why they/I can come off as frustrated that more people don’t seem to worship it enough. Basically, I view the equation as the mathematical way of saying “you should form and update your beliefs based on evidence.” Which, as I think any rationalist would agree, is both (a) not as obvious and straightforward as it sounds, and (b) something that would make the world better if more people heeded the call.
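For concreteness, here is the equation itself, plus a toy illustration of the “update your beliefs based on evidence” reading; the numbers below are made up purely for the example.

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}$$

Suppose a condition has a 1% base rate, and a test flags it 90% of the time when it is present but also 9% of the time when it is absent. Then a positive result gives

$$P(\text{condition} \mid +) = \frac{0.9 \times 0.01}{0.9 \times 0.01 + 0.09 \times 0.99} \approx 0.09,$$

i.e. roughly a 9% posterior, not 90%. The point isn’t that anyone should run this arithmetic in daily life; it’s that the prior matters, and that evidence shifts beliefs by degrees rather than flipping them between “certain” and “unknowable.”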