Can asking for advice be bad? From Eliezer’s post Final Words:
I understand this to mean just asking for advice, not necessarily following it. Why can that be a bad thing? For a true Bayesian, information would never have negative expected utility. But humans aren’t perfect Bayes-wielders; if we’re not careful, we can cut ourselves. How can we cut ourselves in this case? I suppose you could have made up your mind to follow a course of action that happens to be correct, and then someone you ask for advice changes your mind.
Is there more to it? Please reply at the original post: Final Words.
(Not replying “at the original post” because others haven’t and now this discussion is here.)
That fragment of “Final Words” comes from a paragraph about the consequences of underconfidence.
Suppose (to take a standard sort of toy problem) you have a coin which you know comes up heads either 60% of the time or 40% of the time. (Note: in the real world there are probably no such coins, at least not when they’re tossed in a manner not designed to enable bias. But never mind.) And suppose you have some quantity of evidence about which sort of coin it is, perhaps derived from seeing the results of many tosses. If you’d been tallying them up carefully there wouldn’t be much room for doubt about the strength of your evidence, so (to leave room for underconfidence) let’s say you’ve just been watching and formed a general impression.
Underconfidence would mean, for example, that you’ve seen an excess of T over H over a long period, but you underestimate how much information that gives you, so you think (let’s say) there’s only a 55% chance that it’s a T>H coin rather than an H>T coin. Now someone trustworthy comes along and tells you he tossed the coin once and it came up H. That observation has probability 60% on the H>T hypothesis and probability 40% on the T>H hypothesis, so it’s 3:2 evidence for H>T, which is enough to overturn your weak 55% prior; if you immediately have to bet a large sum on either H or T, you should bet it on H.
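As a quick aside, it’s worth seeing how fast such an excess of tails adds up. Under the 60/40 numbers, each excess tail (tails minus heads) multiplies the odds in favor of T>H by 3:2, and equal numbers of heads and tails cancel out. A minimal sketch in Python, assuming independent tosses and even prior odds between the two hypotheses (the function name is mine):

```python
# How confident that the coin is T>H do N excess tails make you,
# starting from even odds? Each excess tail is 3:2 evidence for T>H.

def confidence_t_gt_h(excess_tails: int) -> float:
    odds = 1.5 ** excess_tails   # odds of T>H : H>T after the observations
    return odds / (1 + odds)     # convert odds to a probability

for n in (1, 3, 6):
    print(n, round(confidence_t_gt_h(n), 3))  # -> 0.6, 0.771, 0.919
```

So roughly six excess tails already justify about 90% confidence, which is the figure that matters below.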
But maybe the _real_ state of your evidence, before this person’s new information, justifies 90% confidence that it’s a T>H coin, in which case the new information leaves you still thinking it’s more likely T>H, and if you immediately have to bet a large sum you should bet it on T.
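For concreteness, here is the one-toss update in both cases, using Bayes’ rule with the numbers from the example (the 60/40 likelihoods and the 45% vs. 10% priors on H>T); again the function name is mine:

```python
# Update P(coin is H>T) after one reported H from a trustworthy source.

def posterior_h_gt_t(prior_h_gt_t: float) -> float:
    """P(H>T | one H observed), by Bayes' rule."""
    p_h_if_hgt = 0.6  # P(heads) if it's the H>T coin
    p_h_if_tgt = 0.4  # P(heads) if it's the T>H coin
    num = prior_h_gt_t * p_h_if_hgt
    return num / (num + (1 - prior_h_gt_t) * p_h_if_tgt)

# Underconfident: 55% on T>H, i.e. only 45% on H>T.
print(posterior_h_gt_t(0.45))  # ~0.551 -> H>T now slightly more likely: bet H

# What the evidence really justifies: 90% on T>H, i.e. 10% on H>T.
print(posterior_h_gt_t(0.10))  # ~0.143 -> still probably T>H: bet T
```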
Thus: if you are underconfident you may take advice you shouldn’t, because you underweight what you already know relative to what others can tell you.
Note that this is all true even if the other person is scrupulously honest, has your best interests at heart, and agrees with you about what those interests are.
I’d trust myself not to follow bad advice. I’d probably be willing to ask a person I didn’t respect very much for advice, even if I knew I wasn’t going to follow it, just as a chance to explain why I’m going to do what I’m going to do, so that they understand why we disagree, and don’t feel like I’m just ignoring them. You can’t create an atmosphere of fake agreement by just not confronting the disagreement. They’ll see what you’re doing.
That’s because a true Bayesian already has the information, in a sense that we don’t: they know every way any experiment could go (even if not which way it will).
You have more at stake than they do. (Also, watch out for cases where they have vested interests.)
EDIT: If you have an amazing knockdown counter-argument, please share it.