I think this is what many people find confusing about Bayesian reasoning: the nods to subjectivity. People I’ve discussed it with often say things like “But it’s still subjective!” or “You’re pulling those numbers out of your ass!” Well, yes, maybe. But calibration games, forecasting tournaments with scored feedback like the Good Judgment Project, and plain awareness of one’s priors have been shown to beat chance, and to beat unaided intuition, when the RESULTS are objectively measured.
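To make “objectively measured” concrete, here’s a minimal sketch (not from any particular study, numbers made up for illustration) of how subjective probability forecasts get scored: the Brier score is just the mean squared error between your stated probabilities and what actually happened, so “out of your ass” numbers still get graded against reality.

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between stated probabilities and 0/1 outcomes.

    Lower is better; always guessing 50/50 scores 0.25.
    """
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)


# Hypothetical forecasts and outcomes, purely for illustration.
forecasts = [0.9, 0.7, 0.2, 0.4, 0.1]   # subjective probabilities
outcomes = [1, 1, 0, 0, 0]              # what actually happened

print(brier_score(forecasts, outcomes))   # ~0.062: better than chance
print(brier_score([0.5] * 5, outcomes))   # 0.25: the chance baseline
```

The point isn’t the specific numbers; it’s that a purely subjective input produces a score anyone can verify.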
If I had to speculate on why people react so negatively, I’d say it’s because of the false dichotomy between “objective” and “subjective.” Objective: numbers, math, computer programs. Subjective: fuzzy, the stuff that isn’t science. So, saying that something can be evidence in a rational approach AND be subjective...is confusing. They aren’t thinking about weighting the evidence in proportion to its reliability. They’re annoyed that it’s counted at all.
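A hedged sketch of what “counted, but weighted” means in practice (all numbers invented): Bayes’ rule in odds form lets a subjective report move your belief, but only in proportion to how diagnostic it is.

```python
def update(prior, likelihood_ratio):
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)


prior = 0.30                   # a subjective starting belief
print(update(prior, 1.5))      # weak, fuzzy evidence:  0.30 -> ~0.39
print(update(prior, 20.0))     # strong evidence:       0.30 -> ~0.90
```

Weak evidence nudges the posterior; strong evidence swings it. Nothing gets thrown out, and nothing gets more weight than it earns.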