Thanks for that comment! Eliezer often says people should be more sensitive to evidence, but an awful lot of real-life evidence is in fact much weaker, noisier, and easier to misinterpret than it seems. And it's not enough just to keep a bunch of Bayesian mantras in mind: you need to be aware of survivorship bias, publication bias, Simpson's paradox, and many other non-obvious traps; otherwise you go silently wrong and don't even know it. In a world where most published medical results fail to replicate, how much should we trust our own conclusions?
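To make the publication-bias point concrete, here's a toy simulation (my own illustration, not something from the original discussion; all names and numbers are made up). Studies measure a small true effect with a lot of noise, but only results above an arbitrary "impressiveness" threshold get published, so the published literature systematically overstates the effect:

```python
import random

random.seed(0)

true_effect = 0.1   # the small real effect (assumed value)
noise_sd = 1.0      # per-study measurement noise (assumed value)
threshold = 1.0     # only results this large get published (assumed filter)

def run_study():
    # Each study observes the true effect plus Gaussian noise.
    return true_effect + random.gauss(0, noise_sd)

results = [run_study() for _ in range(10_000)]
published = [r for r in results if r > threshold]

all_mean = sum(results) / len(results)
pub_mean = sum(published) / len(published)

print(f"mean effect across all studies:  {all_mean:.2f}")
print(f"mean effect in published papers: {pub_mean:.2f}")
```

The unfiltered mean lands near the true effect, while the published mean is several times larger, even though every individual study was an honest, unbiased measurement. Reading only the published results, you would confidently "update" toward a mostly fictional effect, which is the sense in which noisy evidence plus a selection filter is worse than it looks.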
Would it be more honest to recommend that people never update at all? But then everyone would stick to their favorite theories forever… Maybe a better recommendation is to watch out for "motivated cognition" and try to be more skeptical of all theories, including your favorites.