Probably an obvious point: epistemically that’s an error, but politically it’s probably an indispensable tactic. Say you do an honest and perfectly reliable utilitarian analysis, and find that chimpanzees really should not be used in research; the real, substantial medical advances are not worth their suffering. But frustratingly, the powers that be don’t care about chimps as much as they should. Your only hope is to convince them that chimp-using research is nearly useless to humans, so that even their undersized compassion for chimps will convince them to shut the research down.
I have a kind of romantic suspicion that nearly all politically active people are like this, that if you could somehow get them alone and sit them down and ask them what they really think, they’d go, “Yes, congratulations Einstein, you figured it out. Of course if we succeed then it’s likely the lives of [some group] will get a lot worse, but, well, omelets and eggs.” And then they swear you in and give you a membership card, because if you’ve gotten this far, then you can also see that they’re justified.
I have a kind of romantic suspicion that nearly all politically active people are like this
That would be nice. But if all the politicians were so rational, then why in the name of Aumann’s agreement theorem would they disagree with each other so much?
Unless that too would be some kind of deception, necessary to achieve maximum utility. Maybe the average stupid humans (non-politicians) simply need to see a few battling factions, so if all these rational politicians suddenly stopped pretending to disagree with each other, the angry voters would vote for someone genuinely stupid, just to have more variety.
Well… I suppose politicians are on average more rational than average humans. At least instrumentally; that is why they are in politics, have power, and make $$$, while the average citizen merely spends their time watching them on TV. And probably even epistemically, because I expect epistemic rationality to correlate at least somewhat positively with instrumental rationality. And because there are some things that politicians must strategically pretend, I expect them to be less mindkilled than they seem. They also have better information on political topics. But all this considered, I think they are still prone to all the human biases, just perhaps a bit less than the average human.
Maybe the average stupid humans (non-politicians) simply need to see a few battling factions, so if all these rational politicians suddenly stopped pretending to disagree with each other, the angry voters would vote for someone genuinely stupid, just to have more variety.
I think it makes more sense to look at the incentives of politicians. Politicians want to win. They want to be reelected.
That means they have to somehow appear better than the other party.
Most politicians also think about their careers. They have to impress fellow politicians.
I expect them to be less mindkilled than they seem.
(Nods) that’s really what I was trying to say, yeah.
Also it’s worth noting that the AAT only applies to epistemic agreement, right? It doesn’t prevent groups from competing over resources: we agree that the pie is tasty, which is precisely why we’re fighting over it. Of course, if you’re committed to fighting, then screwing with your enemy’s (and partially-committed ally’s) models of the world is a valid combat tactic.
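To make the epistemic half of that concrete, here is a minimal toy sketch of the dialogue process behind the theorem, in the style of Geanakoplos and Polemarchakis’s “We Can’t Disagree Forever”: two Bayesians with a common prior take turns announcing their posterior for an event, each announcement becomes common knowledge, and the posteriors provably merge. Everything below (the state space, the event, the agents and their partitions) is my own invented illustration, not anything from the thread.

```python
# Toy sketch of the agreement dialogue behind Aumann's theorem, in the
# style of Geanakoplos-Polemarchakis. The state space, event, and
# partitions are all invented for illustration.
from fractions import Fraction

STATES = range(9)                             # finite state space
PRIOR = {w: Fraction(1, 9) for w in STATES}   # common prior (uniform, by assumption)

A = {0, 1, 4, 5, 8}                           # the event both agents care about
PARTITIONS = {                                # each agent's private information
    "alice": [{0, 1, 2}, {3, 4, 5}, {6, 7, 8}],  # alice learns the "row"
    "bob":   [{0, 3, 6}, {1, 4, 7}, {2, 5, 8}],  # bob learns the "column"
}

def cell(partition, state):
    """The partition cell containing `state`: what the agent privately sees."""
    return next(c for c in partition if state in c)

def posterior(info):
    """P(A | info) under the common prior."""
    total = sum(PRIOR[w] for w in info)
    return sum(PRIOR[w] for w in info if w in A) / total

def dialogue(true_state, max_rounds=20):
    """Agents alternately announce P(A); each announcement becomes common knowledge."""
    public = set(STATES)                      # states not yet publicly ruled out
    last = {}
    for _ in range(max_rounds):
        for name, part in PARTITIONS.items():
            q = posterior(cell(part, true_state) & public)
            last[name] = q
            # Listeners keep only the states in which `name` would have
            # announced exactly q -- that is all the announcement reveals.
            public = {w for w in public
                      if posterior(cell(part, w) & public) == q}
        if len(set(last.values())) == 1:      # posteriors have merged
            return last
    return last

print(dialogue(true_state=4))   # both end up announcing P(A) = 1
```

The point of the toy is that the merging only happens because both agents honestly report posteriors under a genuinely common prior; it says nothing about parties fighting over the pie itself, which is exactly the gap the comment above points at.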
But frustratingly the powers that be don’t care about chimps as much as they should. Your only hope is to convince them that chimp-using research is nearly useless to humans, so that even their undersized compassion for chimps will convince them to shut the research down.
The problem is that these kinds of lies create a vicious cycle. Someone who shares your utility function and honestly believes your lies will want to shut down research even in cases where you wouldn’t, and will feel justified in inventing lies (on top of the lies he believes to be true) to promote that position. Then people start believing those lies and feel justified in inventing further lies, and so on.
Is there a name for a bias that causes you to ignore trade-offs and pretend that there are no costs to doing something, for example claiming that eliminating the use of chimps in medical research won’t harm medical research because “scientific methods and technologies have rendered their use in research largely unnecessary”?
I’d call it scope insensitivity, and if I were feeling snippy, I’d call it motivated scope insensitivity.
See also “sacred values”, though I’m not sure that they’re a bias.
Halo effect?