As I read through, the core model fit well with my intuition. But then I was surprised when I got to the section on religious schisms! I wondered why we should model the adherents of a religion as trying to join the school with the most ‘accurate’ claims about the religion.
On reflection, it appears to me that the model probably holds roughly as well in the religion case as in the local radio intellectual case. Both are examples of "hostile" talking up. I wonder if the ways in which those cases diverge from pure information sharing explain the difference between humble and hostile.
In particular, perhaps some audiences are looking to reduce cognitive dissonance between their self-image as unbiased on the one hand and their particular beliefs and preferences on the other. That leaves an opening for someone to sell a reasonableness/unbiasedness self-image to people holding a given set of beliefs and preferences.
Someone making reasonable counterarguments is a threat to what you've offered, and in that case your job is to provide refutation, counterargument, and discrediting, so that the person's arguments can easily be dismissed (through a mixture of claimed flaws in the arguments and claimed flaws in the person promoting them). This would be a 'hostile' talking up.
Also, we should probably expect to find it hard to distinguish some hostile talking ups from overconfident talking downs. If we could always distinguish them, hostile talking up would be a clear signal of defeat.
Good points!
I would agree that people are generally reluctant to blatantly deceive themselves. There is some cost to holding incorrect beliefs, though its magnitude can vary greatly with the situation.
For instance, suppose all of your friends go to one church, and you start to suspect your local minister of being less accurate than others. If you genuinely don't trust them, you can either pretend you do and live accordingly, or be honest and risk having all of your friends dislike you. You clearly have a strong motivation to believe something specific here, and I think incentives generally trump internal honesty.[1]
On the last point, I don't think "hostile talking up" is what the hostile actors want to be seen as doing :) Rather, they would be trying to make it seem like the people previously above them are really below them. To themselves and their followers, they appear to be at the top of their relevant distribution.
1) There's been a lot of discussion recently about politics being tribal, and I think it makes a lot of pragmatic sense. link