I thought the SSC neoreaction anti-FAQ was extremely weak. You obviously thought it was extremely strong. We have parsed the same arguments and the same data, yet come out with diametrically opposed conclusions. That’s not how it’s supposed to work.
Well, sometimes that’s exactly how it’s supposed to work.
For example, if you have high confidence in additional information which contradicts the premises of the document in whole or in part, and VB is not confident in that information, then we’d expect you to judge the document less compelling than VB does. And if you wished to make a compelling argument that you were justified in that judgment, you could lay out the relevant information.
Or suppose you’ve performed a more insightful analysis of the document than VB has, such that you’ve identified rhetorical sleight-of-hand that tricks VB into accepting certain lines of reasoning as sound when they actually aren’t, or as supporting certain conclusions when they actually don’t, or something of that nature. Here again we’d expect you to judge the document less compelling than VB does, and if you wished to make a compelling argument that you were justified in that judgment, you could lay out the fallacious reasoning step-by-step.
Do you believe either of those is the case?
I don’t want to focus on the anti-neoreactionary FAQ, because I don’t want this to get dragged into a debate about neoreaction. In particular, I simply don’t know how Viliam_Bur parsed the document, or what additional information one of us is privy to that the other is not. My point is that this is a general issue in politics: one group of people finds a piece compelling, and another group finds it terrible.
And note, too, that this isn’t experienced as something emotional or personal, but rather as a straightforward matter of truth. In this case, VB thinks neoreactionaries should be “deeply shocked and start questioning their own sanity.” In other words, he thinks the argument is basically settled, and implies that people who persist in neoreaction are irrational, crazy, or something along those lines. Again, this is a general issue in politics. People generally believe (or at least talk as if they believe) that people who disagree with them politically are clinging to refuted beliefs in the face of overwhelming evidence. I don’t think this is just due to epistemic closure, although that is part of it. I think it’s partly an emotional and cultural thing: we are moved for pre-rational reasons, but our minds represent this to us as truth.
I am certainly not saying I am immune to this, but I don’t have a third-party view of myself. I am not saying I am right and Viliam_Bur is wrong on the case in point. But I do wonder how many neoreactionaries have been deconverted by that FAQ. I suspect the number is very low...
To the extent that you’re making a general point—which, if I’ve understood you correctly, is that human intuitions of truth are significantly influenced by emotional and cultural factors, including political (and more broadly tribal) affiliations—I agree with your general point.
And if I’ve understood you correctly, despite the fact that most of your specific claims in this thread are about a specific ideology and a specific document, you don’t actually want to discuss those things. So I won’t.
I’m happy to discuss specifics, just not the neoreactionary FAQ. I agree with VB that LW has an unhealthy tendency for every discussion to become about neoreaction, and I don’t like it.
Instead, how about this article? Jim Edwards is a bright guy, and he clearly intended to persuade with that post. And indeed he has plenty of commenters who think he was making a valuable point. Yet I am at a loss to say what it is. Here he is, claiming to have a graph showing that government spending affects economic growth, yet all the graph shows is changes in government spending. It doesn’t show a correlation, it doesn’t suggest causation, it doesn’t do anything of the sort. Yet some people find this persuasive.
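To make the gap concrete, here is a minimal sketch (in Python, with made-up placeholder numbers rather than anything from the article) of the weakest evidence the post would actually need: some measured association between changes in government spending and growth, not a plot of spending changes alone.

```python
# Hypothetical illustration only: the numbers below are invented.
# The post's chart shows only the first series; to claim an effect on
# growth you need, at minimum, to relate it to the second series.
from statistics import correlation  # available since Python 3.10

spending_change = [2.1, -0.5, 1.3, 0.8, -1.2, 0.4]  # % change in govt spending
gdp_growth = [1.8, 0.2, 2.5, 1.1, -0.3, 0.9]        # % GDP growth, same years

# Pearson correlation: a first, weak piece of evidence for an association.
r = correlation(spending_change, gdp_growth)
print(f"correlation = {r:.2f}")

# Even a strong correlation would not establish causation (confounders,
# reverse causality), but a chart of spending_change alone establishes
# nothing at all about the relationship.
```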
When someone says they like dance music (for example), I feel like I’m missing out; they get joy out of something I hate, which in some ways makes them better than me, but fundamentally de gustibus non est disputandum: there’s no arguing about taste. The older I get, the more I feel like that’s how all persuasion works.
Yup, those charts puzzle me, too (based on about five seconds of analysis, admittedly, but I have a strong preexisting belief that there are many examples of such silliness on the Internet, so I’m strongly inclined to agree that this particular chart is yet another example… which is of course yet another example of the kind of judgment-based-on-non-analytical factors we’re discussing).
How confident are you that this is how all persuasion works?
I don’t know how general this is, but I do think it’s an important factor that I don’t see discussed.
Another point is peer effects. I remember at school my physics teacher used to use proof by intimidation: he would attempt to browbeat and ridicule students into agreeing with him on some subtly incorrect argument. And he wouldn’t get agreement just because he scared people; the force of his personality and the desire not to look foolish would genuinely convince them. And then he’d get cross for real, saying no, you need to stand up for yourselves and think through the maths. But if you can’t fully think through the soundness of the arguments, if you are groping around between the correct and the incorrect answer, then you will be swayed by these social effects. I think a lot of persuasion works like that, but on a more subtle and long-term level.
Yes, I agree.

That’s kind of a general issue with humans, and it usually goes by the name of confirmation bias.

For example, debates about religion or, say, global warming work in exactly the same way.
But I don’t think it’s just confirmation bias. People do get won over by arguments. People do change their minds, convert, etc. And often after changing their mind they become just as passionate for their new cause as they ever were for the old. But what is persuasive and what is logical sometimes seem disjoint to different people.
You are right that these things afflict some areas more than others. Politics and religion are notoriously bad. And I do think a large part of it is that people simply have very different standards for what a successful argument looks like, and that this is almost an aesthetic.
Sure, confirmation bias is a force, but it’s not an insurmountable one. It makes changing one’s beliefs difficult, not impossible.
But what is persuasive and what is logical sometimes seem disjoint to different people.
I agree and I don’t find this surprising. People are different and that’s fine.
Take the classic “Won’t somebody please think of the children!” argument. I, for example, find it deeply suspect, to the extent that it works as an anti-argument for me. But a not inconsiderable number of people can be convinced by it (and, in general, by emotional-appeal strategies).
I suspect that which kinds of people are convinced by which kinds of arguments would be an interesting area to research.
But I do wonder how many neoreactionaries have been deconverted by that FAQ. I suspect the number is very low...
This is an interesting question that seems empirically testable: we could ask those people and run a poll. Although there is a difference between “believing that NRs are probably right about most things” and “self-identifying as NR”. I would guess there were many people impressed (but not yet completely convinced) by NR without accepting the label (yet?) who were less impressed after reading the FAQ. So the losses among potential NRs were probably much higher than among already fully convinced NRs.