Now THAT part is just plain embarrassing. I mean, it’s truly a mark of shame upon us if we have a tool that we know works, we are given full access to that tool, and we still can’t outperform what the tool does on its own, unaided.
Coincidentally, I was planning to write an article “defending” the use of fallacies on Bayesian grounds. A typical passage would go like this:
People say it’s fallacious to appeal to authority. However, if you learn that experts believe X, you should certainly update some finite amount in favor of believing X, as experts are, in general, more likely to believe X if it is true than if it is false—even if you can find many exceptions.
Indeed, it would be quite a strange world if experts were consistently wrong about a given subject matter X, thus making their opinions on X into evidence against X, because they would have to persist in this error even knowing that their entanglement with X means they would only have to invert their pronouncements, or remain agnostic, to improve their accuracy.
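To make the update in that passage concrete, here’s the standard odds-form Bayes calculation (the specific numbers are illustrative assumptions, not from the original):

```latex
% Let E = "experts endorse X". If experts are right more often than not,
% then P(E \mid X) > P(E \mid \neg X), and observing E raises the odds on X:
\frac{P(X \mid E)}{P(\neg X \mid E)}
  = \frac{P(E \mid X)}{P(E \mid \neg X)} \cdot \frac{P(X)}{P(\neg X)}
% E.g., if P(E \mid X) = 0.8 and P(E \mid \neg X) = 0.4, the likelihood
% ratio is 2, so the posterior odds on X are double the prior odds.
% The "strange world" case is P(E \mid X) < P(E \mid \neg X): the same
% equation then makes expert endorsement evidence *against* X.
```

The asymmetry the passage points at falls out of the same equation: any stable anti-correlation would itself be exploitable, which is why it should be rare.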
Well, it seems we actually do live in such a world, where (some classes of) experts make predictable errors, and don’t take trivial steps to make their opinions more accurate (and entangled with the subject matter).
Well, experts still do better than non-experts on average (afaik), just that they seem to totally ignore tools that could let them do a whole lot better, and also apparently can’t do much better than the tools themselves, even when they’re able to use the tools.
Making predictable errors isn’t the same thing as their opinions being anti-correlated with reality.