Yes I do (because Bayes’ law is a triviality; calling any use of it “Bayesian” is like calling any use of fractions “frequentist”).
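To spell out the “triviality” (a sketch only; A and B here are generic events, not anything from the thread): Bayes’ law is just a rearrangement of the definition of conditional probability,

P(A \mid B)\, P(B) \;=\; P(A \wedge B) \;=\; P(B \mid A)\, P(A),

so, whenever P(B) > 0,

P(A \mid B) \;=\; \frac{P(B \mid A)\, P(A)}{P(B)}.

Nothing in that identity commits you to a Bayesian or a frequentist reading of the probabilities involved.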
I see where you’re coming from, and I agree that probabilistic inference is a more precise term that should be used instead. But I think my sense of shouldness there is much smaller than your sense of shouldness, and I can’t tell if that’s more tolerance for verbal sloppiness on my part or some sort of regret at narrowing the term Bayesian inference. I think I endorse the first but not the second explanation.
I just find LW’s often uninformed “Bayesianism” annoying, that’s all. It’s classic belief as attire: I am willing to bet that having an opinion on these sorts of things makes zero (0) practical difference in most people’s lives.
In fact, lots of things in the “LW cluster” are like this; the many-worlds interpretation (MWI), for instance.
edit: lest you think I am joking, here’s someone who calls the g-formula “an ad hoc frequentist device”:
http://lesswrong.com/lw/hwq/evidential_decision_theory_selection_bias_and/9crr
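(For context, and only as a sketch with illustrative symbols: in its simplest single time-point form, with treatment A, outcome Y, and adjustment covariates C assumed sufficient to control confounding, the g-formula reads

P(Y = y \mid do(A = a)) \;=\; \sum_{c} P(Y = y \mid A = a, C = c)\, P(C = c),

which is derived from the causal assumptions plus ordinary probability calculus; the conditional probabilities on the right can be estimated by Bayesian or frequentist methods alike.)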
Here’s a post by Nyan Sandwich:
http://lesswrong.com/lw/irj/crush_your_uncertainty/
which amounts to saying “get more data!” but which starts as follows:
“Bayesian epistemology and decision theory provide a rigorous foundation for dealing with mixed or ambiguous evidence, uncertainty, and risky decisions.”
which is a complete non sequitur to the main point. I could keep going, but I hope the point is clear. There is an alarming lack of clue coupled with an alarming amount of “rah rah Bayes”. (Note: in case it is not obvious, I have no horse in the Bayesian-vs-frequentist (B vs. F) race at all; I am happy to write papers in either formalism. My point is not about B vs. F at all.)
Do you also get annoyed when people say “prior” when they mean “probability estimate”? As in, “my priors have been updated”.
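(For readers following along, a terminological note rather than anything from the thread: the prior is the distribution P(\theta) held before seeing the data; conditioning on data yields the posterior P(\theta \mid \text{data}). So “my priors have been updated” strictly describes a posterior, i.e. a current probability estimate, which is presumably the conflation being asked about.)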