it’s not so easy to assess unpleasantness...they WANT there to be a societal breakdown
Well, presumably because it would prove them right all along, not because they enjoy chaos...but it doesn’t hold any explanatory power to say that people feel strong emotions towards certain epistemic questions because certain beliefs are more or less pleasant, and then to turn around and say that the reason a belief is (un)pleasant is that it affirms/contradicts a previously held belief. That’s circular.
The initial idea of saying that there is an “unpleasant truth” in every controversy was to create a theory that had predictive power over which issues people would get emotional over. If we then say that unpleasant truths are those which prove people wrong, we lose predictive power—in LW terms, our theory stops paying rent.
We’d be better off just saying “people don’t like changing their minds” in general, if we’re not going to predict which issues and which conditions will create this sort of emotional stubbornness.
I think a lot of disingenuous people are smart enough to say something like “I wish my position were untrue”
I think it’s important to draw a distinction between the satisfaction of having one’s beliefs confirmed and actually wishing certain beliefs to be true. They are both sources of bias and mind-kill, but they are very different. The survivalists are presumably feeling satisfaction for the former reason (vindication) when faced with talk of society collapsing, even as they do not feel the latter (a true preference for a universe where society collapses).
Well, presumably because it would prove them right all along, not because they enjoy chaos
I’m not so sure about that. Once, after a few drinks, I directly confronted a survivalist about this issue. He basically told me that, due to his working-class background, he felt locked out of the elite, and that if there were a societal breakdown he would have the opportunity to become a high-status person.
I would guess that a lot of survivalists have feelings along these lines: that they resent modern society’s power structure, and that at some level they wish it would fall apart.
But anyway, I agree you have articulated a problem with Sailer’s hypothesis. You can always find an “unpleasant truth,” particularly if you read “unpleasant truth” to include situations where people’s long-held beliefs are wrong, regardless of whether the underlying beliefs are pleasant or unpleasant.
The initial idea of saying that there is an “unpleasant truth” in every controversy was to create a theory that had predictive power
I’m not sure that was the idea, but regardless of the aim, I certainly agree that if the hypothesis lacks predictive power, there’s a good chance it’s worthless.
One can put things a slightly different way: How do you know if people are facing evidence of an uncomfortable truth apart from them getting emotional about it?
I think it’s important to draw a distinction between the satisfaction of having one’s beliefs confirmed and actually wishing certain beliefs to be true. They are both sources of bias and mind-kill, but they are very different. The survivalists are presumably feeling satisfaction for the former reason (vindication) when faced with talk of society collapsing, even as they do not feel the latter (a true preference for a universe where society collapses).
Putting aside my question about survivalists’ preferences, why draw the distinction? Ultimately the effect is the same, no?
why draw the distinction? Ultimately the effect is the same, no?
I don’t think so. To continue the survivalist example—a survivalist who wanted the belief that civilization would collapse to be true would be making villainous plots to cause the collapse. A survivalist who simply wanted to be vindicated but didn’t actually desire collapse would look at the first signs of collapse, tell everyone “I told you so” with a rather smug expression, and then join them in the fight to prevent civilization from collapsing.
How do you know if people are facing evidence of an uncomfortable truth apart from them getting emotional about it?
Being emotional is probably not a good signal of this. For example, plenty of atheists are emotional about religion—that doesn’t mean they are uncomfortably aware that it’s actually true in some corner of their minds. One might be emotional because one believes that people who hold certain viewpoints are damaging society.
I think self-deception about uncomfortable truths has some unique tells which are distinct from sheer negative affect. Some of these are discussed in the “belief in belief” articles—to the extent that they can do so without becoming consciously aware of it, people will basically act as if they believe the uncomfortable truth is true, even while professing that it is false.
I think belief in a good afterlife where we will all be together is the most obvious example of this pattern—most people simply don’t act as if death were nothing more than a temporary separation when faced with actual death, regardless of what they profess to believe. At some implicit level, I think most people know that the separation is permanent. (There are exceptions, of course—I’ve seen some particularly strong believers who really were relatively unperturbed in the face of death.)
I don’t think so. To continue the survivalist example—a survivalist who wanted the belief that civilization would collapse to be true would be making villainous plots to cause the collapse. A survivalist who simply wanted to be vindicated but didn’t actually desire collapse would look at the first signs of collapse, tell everyone “I told you so” with a rather smug expression, and then join them in the fight to prevent civilization from collapsing.
I disagree with this based on my general observations of survivalists. I haven’t noticed any of them plotting to undermine civilization, and I doubt that any of them would do much to prevent a collapse. Also, just introspecting, there are a lot of things I wish were different about the world, yet I am doing little or nothing to bring about such changes. I think my attitude is pretty common.
Perhaps more importantly, even if what you are saying is correct, how does it relate to the subject at hand—which is predicting which topics will generate a lot of heat in discussion?
Being emotional is probably not a good signal of this. For example, plenty of atheists are emotional about religion—that doesn’t mean they are uncomfortably aware that it’s actually true in some corner of their minds.
I agree that other things can get people worked up besides cognitive dissonance.
Some of these are discussed in the “belief in belief” articles—to the extent that they can do so without becoming consciously aware of it, people will basically act as if they believe the uncomfortable truth is true, even while professing that it is false.
I like that idea. So one can hypothesize that, at a minimum, in any area where a lot of people’s actions are inconsistent with their professed beliefs, discussion of those beliefs will tend to generate a lot of heat, so to speak. Not sure that covers everything, but it seems like a good start.
There are exceptions, of course—I’ve seen some particularly strong believers who really were relatively unperturbed in the face of death.
And quite possibly those same people remain relatively unperturbed when debating life after death. :)