Maybe this says more about me than about the world, but if this were StackOverflow, this comment would get the star. Thanks.
If there were one element of statistical literacy that you could magically implant in every head, what would it be?
I believe you’re confusing arrogance and closed-mindedness.
Well, maybe that’s the question. They’re different, and you can have one without the other, but do they co-occur above chance? Maybe arrogance reduces your exposure to the occasional clever ideas that will inevitably come from people you’ve dismissed. That isn’t closed-mindedness, it’s something more like as-if-closed-mindedness, but it would come to the same thing.
Clean real-world example of the file-drawer effect
I don’t buy arguments of the form “it must be good otherwise we wouldn’t do it,” but that’s just a quibble. I’d buy a signaling argument, and you’re right that I’m not clear on my terms. This is a stab, but the way I think I’m using arrogance is: using your high abilities to justify an inflated sense of self-worth. OK, applying that back to the question, I don’t see how an inflated sense of self-worth could make you a worse critical thinker. Maybe? I have to think about it more.
Is arrogance a symptom of bad intellectual hygiene?
This work articulates an attack on the use of conjugate priors in a Bayesian analysis: http://projecteuclid.org/DPubS?service=UI&version=1.0&verb=Display&handle=euclid.ba/1340369826 In their words, “conjugate priors may lead to a dogmatic analysis.”
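For context on what’s being attacked, here’s a minimal sketch of conjugacy in the Beta-Binomial case; the numbers and code are my own illustration, not from the linked paper. The point is that a conjugate prior forces the posterior into the prior’s family, so a sufficiently confident prior can shrug off even very lopsided data:

```python
# Minimal sketch of a conjugate (Beta-Binomial) update.
# Illustrative only; not taken from the linked paper.

def beta_binomial_update(alpha, beta, successes, failures):
    """Posterior of a Beta(alpha, beta) prior after binomial data.

    Conjugacy means the posterior is again a Beta distribution:
    the data just add to the prior's pseudo-counts.
    """
    return alpha + successes, beta + failures

# A "dogmatic" prior: very confident the coin is fair.
alpha, beta = 500.0, 500.0

# Data that strongly suggest otherwise: 40 heads, 0 tails.
alpha_post, beta_post = beta_binomial_update(alpha, beta, 40, 0)

# Posterior mean barely moves: ~0.519, versus a prior mean of 0.5.
print(alpha_post / (alpha_post + beta_post))
```

That built-in inertia is one way to read the “dogmatic analysis” charge.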
Sorry for necro.
This post is less about The Truth and more about science as a personal endeavor, as something you do on yourself to be a better thinker, or not.
Sorry if I wasn’t clear. I want to be good at admitting that I was in error, and to collect cases of great thinkers who failed to do that. So yes, I want to be less wrong. Admitting one’s errors is a tool in the less wrong toolbox. We like to think we’re good at it, that it’s easy, that we’re detached, but I’ve seen that being content with my level of self-criticality creates complacency and fosters lapses. These examples demonstrate it.
On the subject of admitting one’s errors, I think DanArmak is right that Newton doesn’t belong on the list if his opinions of alchemy were representative of the time. To replace him, two others from my list of leads: Ernst Haeckel on Lemuria and Jagadish Chandra Bose on sensation/perception in plants and inorganic compounds.
Thanks for asking. I accidentally responded off the main thread.
Hi, I wrote the post. I want to be good at being wrong; I want to be excellent at it. I aspire to develop habits of thought that will protect me and my peers from nursing too gently the need to be right. I thought I might learn something from the ugliest cases.
What happens to even the greatest minds that causes them to get attached to their theories? I don’t know, but your examples will help me find out. A history of baggage? A reputation to protect? Mere age? Pure guts? I agree with pragmatist that Einstein’s concerns about QM aren’t a great example of the prompt. Hoyle, on the other hand, is a great example: he resisted the Big Bang until his death, decades after it had become the most plausible model.
Each of the bulleted examples up top was a great mind with too much emotional baggage to keep from being left behind by science. I don’t want it to happen to me, and generally I want to cultivate in scientific discourse a tone that makes it safe for even the most agitated reasoner to bow out with grace. Thanks for your input and for your leads.
I think a lot about signal detection theory, and I think that’s still the best I can come up with for this question. There are false positives and there are false negatives; both are important to keep in mind; the cost of reducing one is an increase in the other; and humans and human systems will always have both.
So, for example, even the most over-generous public welfare system will turn away some deserving people, and even the most stingy system will admit some undeserving recipients (by whatever definition). So the question (for a welfare system, say) isn’t “how do we prevent abuse?” but “how many abusers are we willing to tolerate for every 100 deserving recipients we reject?” The same framing is useful in lots of medical discussions, legal discussions, pop science discussions, etc.
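To make the tradeoff concrete, here’s a toy sketch (my own numbers, not from any study): two overlapping score distributions and a single decision threshold. Raising the threshold rejects more deserving applicants; lowering it admits more abusers; no threshold zeroes out both.

```python
# Toy signal-detection sketch: one decision threshold trades false
# positives against false negatives. Illustrative numbers only.
import random

random.seed(0)
deserving = [random.gauss(1.0, 1.0) for _ in range(10_000)]   # "signal"
abusers   = [random.gauss(-1.0, 1.0) for _ in range(10_000)]  # "noise"

for threshold in (-1.0, 0.0, 1.0):
    rejected_deserving = sum(x < threshold for x in deserving)  # false negatives
    accepted_abusers   = sum(x >= threshold for x in abusers)   # false positives
    print(f"threshold {threshold:+.1f}: "
          f"deserving rejected {rejected_deserving}, "
          f"abusers accepted {accepted_abusers}")
```

Sliding the threshold moves errors from one column to the other; it never empties both.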