Instead of Quinn admitting lying is sometimes good, I wish he had said something like:
“PADP is widely considered a good charity by smart people who we trust. So we have a prior on it being good. You’ve discovered some apparent evidence that it’s bad. So now we have to combine the prior and the evidence, and we end up with some percent confidence that they’re bad.
If this is 90% confidence they’re bad, go ahead. What if it’s more like 55%? What’s the right action to take if you’re 55% sure a charity is incompetent and dishonest (but 45% chance you misinterpreted the evidence)? Should you call them out on it? That’s good in the world where you’re right, but might disproportionately tarnish their reputation in the world where you’re wrong. It seems like if you’re 55% sure, you have a tough call. You might want to try something like bringing up your concerns privately with close friends and only going public if they share your opinion, or asking the charity first and only going public if they can’t explain themselves. Or you might want to try bringing up your concerns in a nonconfrontational way, more like ‘Can anyone figure out what’s going on with PADP’s math?’ rather than ‘PADP is dishonest’. If this doesn’t work and lots of other people confirm your intuitions of distrust, then your confidence reaches 90% and you start doing things more like shouting ‘PADP is dishonest’ from the rooftops.
Or maybe you’ll never reach 90% confidence. Many people think that climate science is dishonest. I don’t doubt many of them are reporting their beliefs honestly—that they’ve done a deep investigation and that’s what they’ve concluded. It’s just that they’re not smart, informed, or rational enough to understand what’s going on, or to process it in an unbiased way. What advice would you give these people about calling scientists out on dishonesty—again given that rumors are powerful things and can ruin important work? My advice to them would be to consider that they may be overconfident, and that there needs to be some intermediate ‘consider my own limitations and the consequences of my irreversible actions’ step in between ‘this looks dishonest to me’ and ‘I will publicly declare it dishonest’. And that step is going to look like an appeal to consequences, especially if the climate deniers are so caught up in their own biases that they can’t imagine they might be wrong.
I don’t want to deny that calling out apparent dishonesty when you’re pretty sure of it, or when you’ve gone through every effort you can to check it and it still seems bad, will sometimes (maybe usually) be the best course, but I don’t think it’s as simple as you think.”
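For concreteness, here is a minimal sketch of the prior-plus-evidence and cost-benefit reasoning the imagined Quinn is gesturing at above. The prior, likelihood ratio, payoffs, and helper functions (posterior_bad, call_out_is_worth_it) are illustrative assumptions, not anything stated in the dialogue:

```python
# Illustrative sketch only: the prior, likelihood ratio, and payoffs below are
# made-up numbers, not anything asserted in the dialogue above.

def posterior_bad(prior_bad: float, likelihood_ratio: float) -> float:
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    prior_odds = prior_bad / (1 - prior_bad)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

def call_out_is_worth_it(p_bad: float, benefit_if_right: float, harm_if_wrong: float) -> bool:
    """Go public only if the expected benefit of being right exceeds the expected harm of being wrong."""
    return p_bad * benefit_if_right > (1 - p_bad) * harm_if_wrong

# "Widely considered a good charity by people we trust" -> low prior that it's bad.
prior = 0.05
# Suppose the suspicious math is 10x more likely if they're dishonest than if they're honest.
p_bad = posterior_bad(prior, likelihood_ratio=10)
print(round(p_bad, 2))  # ~0.34 -- the apparent evidence moves you, but not to certainty

# If wrongly tarnishing an honest charity is (say) three times as bad as
# exposing a dishonest one is good, 55% doesn't clear the bar but 90% does.
print(call_out_is_worth_it(0.55, benefit_if_right=1.0, harm_if_wrong=3.0))  # False
print(call_out_is_worth_it(0.90, benefit_if_right=1.0, harm_if_wrong=3.0))  # True
```

The point of the threshold isn't the particular numbers; it's that when being wrong is costlier than being right is valuable, 55% and 90% confidence can land on opposite sides of the same decision.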
Instead of Quinn admitting lying is sometimes good, I wish he had said something like:
...and seen what Carter answered.
Part of this is pretty close to what I wrote in the actual debate. The part about climate science is new though and I’d like to see a response to it.
The part about climate science seems like a pretty bog-standard outside view argument, which in turn means I find it largely uncompelling. Yes, there are people who are so stupid, they can only be saved from their own stupidity by executing an epistemic maneuver that works regardless of the intelligence of the person executing it. This does not thereby imply that everyone should execute the same maneuver, including people who are not that stupid, and therefore not in need of saving. If someone out there is so incompetent that they mistakenly perceive themselves as competent, then they are already lost, and the fact that an illegal (from the perspective of normative probability theory) epistemic maneuver exists which would save them if they executed it, does not thereby make that maneuver a normatively good move. (And even if it were, it’s not as though the people who would actually benefit from said maneuver are going to execute it—the whole reason that such people are loudly, confidently mistaken is that they don’t take the outside view seriously.)
In short: there is simply no principled justification for modesty-based arguments, and—though it may be somewhat impolite to say—I agree with Eliezer that people who find such arguments compelling are actually being influenced by social modesty norms (whether consciously or unconsciously), rather than any kind of normative judgment. Based on various posts that Scott has written in the past, I would venture to say that he may be one of those people.