I agree with your disclaimers that not all people go crazy when they start talking politics, and that the predicted bad things don't always happen. The problem is, I can already see how most people would react to a text saying that sometimes, some people go crazy when talking politics: "Meh, 'some people', that definitely doesn't apply to me. Now let me start screaming about why unconditionally supporting my faction is the most important thing ever, and why everyone who doesn't join us is inherently evil and deserves to die painfully." Or they just keep inserting their political beliefs into every other discussion endlessly, because "hey, my political beliefs are rational (unlike the political beliefs of those idiots who disagree with me), and this is a website about rationality, therefore it is important for people here to discuss and accept my political beliefs. If they disagree with me, they fail at rationality forever."
We tried to debate politics here; it usually failed. Apparently, believing in one’s own rationality is not enough.
(There is also another way political topics can destroy rational debate: they attract people who don't really care about the main topic of this website, but only came here to fight for a specific political belief.)
From my perspective, the main problem of "rationality vs politics" is that in a political fight, being transparent about your beliefs is usually not a winning strategy. (Saying "I am 80% sure I am right" is not going to bring the masses to your side. Neither is replying to slogans and tweets with peer-reviewed articles full of numbers.) If you had a completely honest debate about politics, it would have to be done in private, because the participants would have to write things that could ruin their political careers if quoted publicly. (Imagine things like: "Yeah, I know that this specific important person in our party is a criminal, or that this specific popular argument is actually a lie, but I still support them because the future where they prevail seems like a lesser evil compared to the alternatives, for the following reasons: …") So you get a multiplayer Prisoner's Dilemma with high motivation to defect, because breaking the rules of the game in favor of doing the right thing (which is how acting on a strong political belief feels from inside) seems like the right thing to do.
I agree that the evidence currently suggests that being transparent about your beliefs isn’t a winning strategy.
I'm not 100% convinced that we can conclusively say it can't be a winning strategy. Perhaps if norms change, someone who was sufficiently well-calibrated to this way of thinking, effective at communicating (and probably came across as high-status enough), could make it work very effectively.
Could someone be completely honest and still be effective? I’d love to see someone who could pull this off, and I haven’t written this off as a possibility. But maybe I’m being naive :-)
The effectiveness of truth and lying depends on the environment. For example, imagine a culture where political debates on TV were immediately followed by impartial fact checking. Or a culture where politicians have to make predictions about future events ("I don't know" also counts as a valid prediction), and these are later publicly reviewed and evaluated. And, importantly, where the citizens actually care about the results. I suppose such an environment would bring more truth into politics.
But this is a chicken-and-egg problem, because changing the environment is kinda what politics is about. Also, there are many obvious counter-strategies, such as having loyal people do the "fact checking" in your tribe's favor. (For example, when a politician says something that is approximately correct, like saying that some number is 100 when in reality it is 96, it would be evaluated as "a correct approximation → TRUE" when your side does it, or as "FALSE" when your opponent does it. You could evaluate an opponent's metaphorical statements literally, but the other way round for your allies; etc.)
Could someone be completely honest and still be effective?
That mostly depends on other people, such as voters (whether they bother to check facts) and the media (whether they report on the fact that your statements are more likely to be true). If instead the media decide to publish a completely made-up story about you, and most readers accept the story uncritically, you are screwed.
(There are also ways to hurt 100% honest people without lying about them, such as making them publicly answer a question where the majority of the population believes a wrong answer and gets offended by hearing the correct one. “Is God real?”)
I agree with everything you say. I’m reminded of Dan Gardner’s terrific book Future Babble on how and why people respond to pundits even though their predictions/calibrations are often woeful.
The thing is, in the same way that there are people who can get away with clearly being in bad faith and not being truthful, I think there are probably some people who can get away with being relatively well-calibrated and up-front about not treating everything in black and white terms, and still be effective communicators, and effective in politics in general.
I wouldn’t be surprised if there are some other factors (something to do with the conveyed social status of the person in question?) that relate to their effectiveness that are in some ways independent of the positions they actually take. Taking the “Is God real?” example, it’s true that the vast majority of people couldn’t get away with that. But there are probably some people who could get away with it.
I’m speculating. And to my mind this is an empirical question. The fact that no one comes to mind probably indicates I’m wrong. But I can always be hopeful!
I’m also writing this in a rush, so apologies if I haven’t been very clear. Thanks for the comments!