Fewer people seemed to be on the fence than I expected; “the distribution of opinions about neoreaction” seemed bimodal.
I suspect this is the polarizing effect of politics, not something specific to LW or to neoreaction. We are talking about labels, not ideas. I may agree with half of a movement’s ideas and disagree with the other half, but I usually have a clear opinion about whether I want to identify with a label or not.
I understand that LessWrong consists of real people, but when I think about LessWrong, the mental image that comes to my mind is that of a place, an abstract entity, not a community of people.
My mental image of the LW community is more or less “people who have read the Sequences, and in general agree with them”. Yes, I am aware that in recent years many people ignore this stuff, to the degree where mentioning the Sequences is a minor faux pas. (And for a while it was a major faux pas, and some people loudly insisted that telling someone to read the Sequences is lesswrongese for “fuck you”. Not sure how much of that attitude actually came from the “Rational”Wiki.) That, in my opinion, is a bad thing, and it sometimes leads to reinventing the wheel in debates. To put it shortly, it seems to me we have lost the ability to build new things and have become an online debate club. Still a high-quality online debate club. Just not what I hoped for at the beginning.
What I am trying to say is that when I see neoreactionaries commenting on LessWrong, I do not perceive them as “them” if they talk, in a manner close enough to the LessWrong style, about topics that are LW topics.
LessWrong was built upon some ideas, and one of them was that “politics is the mindkiller” and that we strive to become more rational, instead of being merely clever arguers. At this moment, neoreactionaries are the group most visibly violating this rule. They strongly contribute to the destruction of the walled garden. Debating them over and over again is privileging a hypothesis; why not choose any other fringe political belief instead, or try creating a new one from scratch, or whatever?
And I guess that if we are to overcome biases we will have to deal with politics.
Politics is an advanced topic for a rationalist. Before going there, one should make sure they are able to handle the easier situations first. Also, there should be some kind of feedback, some way of warning people “you have strayed from the path”. Otherwise we will only have clever arguers competing using their verbal skills. When a rationalist sympathetic to neoreaction reads the SSC neoreaction anti-faq, they should be deeply shocked and start questioning their own sanity. They should realize how much they have failed the art of rationality by not realizing most of that on their own. They should update on their own ability to form epistemically correct political opinions, instead of inventing clever rationalizations for an already-written bottom line.
In my opinion, Yvain is the person most qualified for the task of debating politics rationally, and the only obvious improvement would be to somehow find a dozen different Yvains coming from different cultural backgrounds and let them debate each other. But one doesn’t get there by writing their bottom line first.
To put it shortly, it seems to me we have lost the ability to build new things and have become an online debate club.
Did LW as a group ever have this ability? Going by the archives, it seems there were a small number (fewer than 10) of posters on LW who could do this. Now that they’re no longer posting regularly, new things are no longer produced here.
try creating a new one from scratch, or whatever?
A reasonable case could be made that this is how NRx came to be.
A reasonable case could be made that this is how NRx came to be.
If this is where NRx came from, then I am strongly reminded of the story of the dog that evolved into a bacterium. An alternative LW-like community that evolved into an aggressive political movement? Either everyone involved was an advanced hyper-genius or something went terribly wrong somewhere along the way. That’s not to say that nothing valuable resulted, but “mission drift” would be a very mild phrase.
Show me that movement in actual politics. Is any NRx-er running for office? Do they have an influential PAC? A think tank in Washington, some lobbyists, maybe?
Oh, I think we’re using the phrase “political movement” in different senses. I meant something more like “group of people who define themselves as a group in terms of a relatively stable platform of shared political beliefs, which are sufficiently different from the political beliefs of any other group or movement”. Other examples might be libertarianism, anarcho-primitivism, internet social justice, etc.
I guess this is a non-standard usage, so I’m open to recommendations for a better term.
Yep, looks like we are using different terminology. The distinction between political philosophy and political movement that I drew is precisely the difference between staying in the ideas/information/talking/discussing realm and moving out into the realm of real-world power and power relationships. What matches your definition I’d probably call a line of political thought.
Mencius Moldbug is a political philosopher. Tea Party is a political movement.
When a rationalist sympathetic to neoreaction reads the SSC neoreaction anti-faq, they should be deeply shocked and start questioning their own sanity
Sentiments like this are, in my opinion, a large part of why “politics is the mind-killer.” I am no neoreactionary, but I thought the SSC neoreaction anti-faq was extremely weak. You obviously thought it was extremely strong. We have parsed the same arguments and the same data, yet come out with diametrically opposed conclusions. That’s not how it’s supposed to work. And this is far from a unique occurrence. I frequently find the same article or post being held up as brilliant by people on one side of the political spectrum, and dishonest or idiotic by people on the other side.
It is not merely that people don’t agree on what’s correct; we don’t even agree on what a successful argument looks like.
I thought the SSC neoreaction anti-faq was extremely weak. You obviously thought it was extremely strong. We have parsed the same arguments and the same data, yet come out with diametrically opposed conclusions. That’s not how it’s supposed to work.
Well, sometimes that’s exactly how it’s supposed to work.
For example, if you have high confidence in additional information which contradicts the premises of the document in whole or in part, and VB is not confident in that information, then we’d expect you to judge the document less compelling than VB. And if you wished to make a compelling argument that you were justified in that judgment, you could lay out the relevant information.
Or if you’ve performed a more insightful analysis of the document than VB has, such that you’ve identified rhetorical sleight-of-hand in the document that tricks VB into accepting certain lines of reasoning as sound when they actually aren’t, or as supporting certain conclusions when they actually don’t, or something of that nature, here again we’d expect you to judge the document less compelling than VB does. And you could lay out the fallacious reasoning step-by-step if you wished to make a compelling argument that you were justified in that judgment.

Do you believe either of those are the case?
I don’t want to focus on the anti-neoreactionary FAQ, because I don’t want this to get dragged into a debate about neoreaction. In particular, I simply don’t know how Viliam_Bur parsed the document, or what additional information one of us is privy to that the other is not. My point is that this is a general issue in politics, where one group of people finds a piece compelling, and another group finds it terrible.
And note too that this isn’t experienced as something emotional or personal, but rather as a general argument for the truth. In this case, VB thinks neo-reactionaries should be “deeply shocked and start questioning their own sanity.” In other words, he thinks this is basically a settled argument, and implies that people who persist in their neoreaction are basically irrational, crazy or something along those lines. Again, this is a general issue in politics. People generally believe (or at least, talk like they believe) that people who disagree with them politically are clinging to refuted beliefs in the face of overwhelming evidence. I don’t just think this is due to epistemic closure, although that is part of it. I think it’s partly an emotional and cultural thing, where we are moved for pre-rational reasons but our minds represent this to us as truth.
I am certainly not saying I am immune from this, but I don’t have a third-party view of myself. I am not saying I am right and Viliam_Bur is wrong on the case in point. But I do wonder how many neoreactionaries have been deconverted by that FAQ. I suspect the number is very low...
To the extent that you’re making a general point—which, if I’ve understood you correctly, is that human intuitions of truth are significantly influenced by emotional and cultural factors, including political (and more broadly tribal) affiliations—I agree with your general point.
And if I’ve understood you correctly, despite the fact that most of your specific claims in this thread are about a specific ideology and a specific document, you don’t actually want to discuss those things. So I won’t.
I’m happy to discuss specifics, just not about the neoreactionary FAQ. I agree with VB that LW has an unhealthy tendency for every discussion to become about neoreaction, and I don’t like it.
Instead, how about this article. Jim Edwards is a bright guy, and he clearly intended to persuade with that post. And indeed he has plenty of commenters who think he was making a valuable point. Yet I am at a loss to say what it is. Here he is, claiming to have a graph showing that government spending affects economic growth, yet all that graph shows is changes in government spending. It doesn’t show a correlation, it doesn’t suggest causation, it doesn’t do anything of the sort. Yet some people find this persuasive.
When someone says they like dance music (for example), I feel like I’m missing out; they get joy out of something I hate, which in some ways makes them better than me, but fundamentally de gustibus non est disputandum. The older I get, the more I feel like that’s how all persuasion works.
Yup, those charts puzzle me, too (based on about five seconds of analysis, admittedly, but I have a strong preexisting belief that there are many examples of such silliness on the Internet, so I’m strongly inclined to agree that this particular chart is yet another example… which is of course yet another example of the kind of judgment-based-on-non-analytical factors we’re discussing).
How confident are you that this is how all persuasion works?
I don’t know how general this is, but I do think it’s an important factor that I don’t see discussed.
Another point is peer effects. I remember at school my physics teacher used to use proof by intimidation, where he would attempt to browbeat and ridicule students into agreeing with him on some subtly incorrect argument. And he wouldn’t just get agreement because he scared people; the force of his personality and the desire not to look foolish would genuinely convince them. And then he’d get cross for real, saying no, you need to stand up for yourself, think through the maths. But if you can’t fully think through the soundness of the arguments, if you are groping around between the correct and the incorrect answer, then you will be swayed by these social effects. I think a lot of persuasion works like that, but on a more subtle and long-term level.

Yes, I agree.

That’s kinda a general issue in humans and usually goes by the name of Confirmation Bias. For example, debates about religion or, say, global warming work in exactly the same way.
But I don’t think it’s just confirmation bias. People do get won over by arguments. People do change their minds, convert, etc. And often after changing their mind they become just as passionate for their new cause as they ever were for the old. But what is persuasive and what is logical sometimes seem disjoint to different people.
You are right that these things afflict some areas more than others. Politics and religion are notoriously bad. And I do think a large part of it is that people simply have very different standards for what a successful argument looks like, and that this is almost an aesthetic.
Sure, confirmation bias is a force but it’s not an insurmountable force. It only makes changing one’s beliefs difficult, but not impossible.
But what is persuasive and what is logical sometimes seem disjoint to different people.
I agree and I don’t find this surprising. People are different and that’s fine.
Take the classic “Won’t somebody please think of the children!” argument. I, for example, find it deeply suspect, to the extent that it works as an anti-argument for me. But a not inconsiderable number of people can be convinced by this (and, in general, by emotional-appeal strategies).
I guess which kinds of people are convinced by which kinds of arguments would be an interesting area to research.
But I do wonder how many neoreactionaries have been deconverted by that FAQ. I suspect the number is very low...
This is an interesting question that seems empirically testable—we could ask those people and make a poll. Although there is a difference between “believing that NRs are probably right about most things” and “self-identifying as NR”. I would guess there were many people impressed (but not yet completely convinced) by NR without accepting the label (yet?), who were less impressed after reading the FAQ. So the losses among potential NRs were probably much higher than among already fully convinced NRs.
That’s a warning sign, not a barbed-wire fence patrolled by guards with orders to shoot to kill.
why not choose any other fringe political belief instead, or try creating a new one from scratch, or whatever?
Neoreaction is an interesting line of thought offering unusual—and so valuable—insights. If you don’t want to talk about NRx, well, don’t. If you want to talk about different political beliefs, well, do.
some way of warning people “you have strayed from the path”
What is “the path”? LW is a diverse community and that’s one of its strengths.
When a rationalist sympathetic to neoreaction reads the SSC neoreaction anti-faq, they should be deeply shocked and start questioning their own sanity. They should realize how much they have failed the art of rationality by not realizing most of that on their own.
You did mention mindkill, didn’t you? I recommend a look in the mirror. In particular, you seem to be confusing rationality with a particular set of political values.
epistemically correct political opinions
Political opinions are expressions of values. Values are not epistemically correct or wrong—that’s a category error.
Talking about “neoreaction” (or any other political group) is already a package-deal fallacy. NRs have a set of beliefs. Each of those beliefs individually can be true or false (or disconnected from evidence). These beliefs should be debated individually. It is quite possible that within the set, some beliefs will be true, some will be false, and some will be undefined. Then we can accept the true beliefs and reject the false beliefs. There is no need to use the word “neoreaction” anywhere in that process.
So, instead of having threads about neoreaction, we (assuming we are going to debate politics) should have threads about each individual belief (only one such thread at a time). Then we should provide evidence for the belief or against the belief. Then we should judge the evidence, and come to a conclusion, unconstrained by identity labels.
The fact that we are not already doing it this way is, for me, evidence on the meta level that we are not ready to have political debates.
Debating beliefs separately, understanding the conjunction fallacy, providing evidence, avoiding labels, tabooing words… this is all rationality 101 stuff. This is “the path” we have already strayed from. If we collectively fail at rationality 101, I don’t trust our ability to debate more complex things.
Political opinions are expressions of values. Values are not epistemically correct or wrong—that’s a category error.
A value is “I don’t want children to starve”. A political opinion is “we should increase the minimum wage (so the children will not starve)”. There is more than the value; there is also the model of the world saying that “increasing the minimum wage will reduce the number of starving children (without significant conflict with other values)”. Another person may share the value but reject the model. They may instead have a model that “increasing the minimum wage increases unemployment, and thus increases the number of starving children”, and therefore hold the political opinion “we should remove the minimum wage (so the children will not starve)”. Same value, different models, different political opinions.
It seems to me that people usually differ more in their models than in their values. There are probably few people who really want to optimize the world to increase the number of starving children, but there are many people with political opinions contradicting each other. (Believing too quickly that our political opponents have different values is also covered in the Sequences.)
Each of those beliefs individually can be true or false (or disconnected from evidence). These beliefs should be debated individually.
I don’t think it’s quite that simple.
You are arguing for atomicity of beliefs as well as their independence—you are saying they can (and should) stand and fall on their own. I think the situation is more complicated—the beliefs form a network and accepting or rejecting a particular node sends ripples through the whole network.
Beliefs can support and reinforce each other, they can depend on one another. Some foundational beliefs are so important to the whole network that rejecting them collapses the whole thing. Consider e.g. Christianity—a particular network of beliefs. Some can stand or fall on their own—the proliferation of varieties of Christianity attests to that—but some beliefs support large sub-networks and if you tear them down, the rest falls, too. At the root, if you reject the belief in God, debating, for example, the existence of purgatory is silly.
The package-deal fallacy exists and is real, but excessive reductionism is a fallacy, too, and just as real.
If we collectively fail at rationality 101, I don’t trust our ability to debate more complex things.
Oh, I don’t trust our ability to debate complex things. But debate them we must, because the alternative is much worse. That ability is not a binary flag, by the way.
There is more than the value; there is also the model of the world
True, and these should be separated to the extent possible.
It seems to me that people usually differ more in their models than in their values.
I don’t know about that—I’d like to see more evidence. One of the problems is that people may seem to have the same values at the level of costless declarations (everyone is for motherhood and apple pie), but once the same people are forced to make costly trade-offs between things important to them, the real values come out and I am not sure that they would be as similar as they looked before.
[P]eople may seem to have the same values at the level of costless declarations (everyone is for motherhood and apple pie), but once the same people are forced to make costly trade-offs between things important to them, the real values come out and I am not sure that they would be as similar as they looked before.
Maybe. It seems to me that there could be two systems of political ideas—call them A and B—both of which are pretty credible when taken as wholes, but for which if you take any single proposition from one and examine it in the context of the other, it’s obviously wrong.
(The same thing happens with scientific theories. Key words: “Quine-Duhem thesis”.)
On the other hand, it does also happen that basically-unrelated ideas get bundled together as part of a package deal, and in that case we probably do generally want to try to separate them. So I’m not sure what the best way is to make the tradeoff between splitting and lumping.
An alternative LW-like community that evolved into an aggressive political movement?
As far as I can see it evolved into mostly smart people writing dense texts about political philosophy. That’s a bit different :-)
That would describe quite a few political movements, actually—it’s hardly exclusive to NRx.
Nope, political movements and political philosophy belong to different categories.
Some political movements evolve out of political philosophy texts, but not all political philosophy texts evolve into political movements.
I think that at this point it would be fair to say that a movement has developed out of NRx political philosophy.
Show me that movement in actual politics. Is any NRx-er running for office? Do they have an influential PAC? A think tank in Washington, some lobbyists, maybe?
Nah, man. Once you get to that level of politics, you’re already pozzed.
But I don’t think it’s just confirmation bias. People do get won over by arguments. People do change their minds, convert, etc. And often after changing their mind they become just as passionate for their new cause as they ever were for the old. But what is persuasive and what is logical sometimes seem disjoint to different people.
You are right that these things afflict some areas more than others. Politics and religion are notoriously bad. And I do think a large part of it is that people simply have very different standards for what a successful argument looks like, and that this is almost an aesthetic.
Sure, confirmation bias is a force but it’s not an insurmountable force. It only makes changing one’s beliefs difficult, but not impossible.
I agree and I don’t find this surprising. People are different and that’s fine.
Take the classic “Won’t somebody please think of the children!” argument. I, for example, find it deeply suspect, to the extent that it works as an anti-argument for me. But no inconsiderable number of people can be convinced by this (and, in general, by emotional-appeal strategies).
I guess the question of what kinds of people are convinced by what kinds of arguments would be an interesting area to research.
This is an interesting question that seems empirically testable—we could ask those people and make a poll. Although there is a difference between “believing that NRs are probably right about most things” and “self-identifying as NR”. I would guess there were many people impressed (but not yet completely convinced) by NR without accepting the label (yet?), who were less impressed after reading the FAQ. So the losses among potential NRs were probably much higher than among already fully convinced NRs.
That’s a warning sign, not a barbed-wire fence patrolled by guards with orders to shoot to kill.
Neoreaction is an interesting line of thought offering unusual—and so valuable—insights. If you don’t want to talk about NRx, well, don’t. If you want to talk about different political beliefs, well, do.
What is “the path”? LW is a diverse community and that’s one of its strengths.
You did mention mindkill, didn’t you? I recommend a look in the mirror. In particular, you seem to be confusing rationality with a particular set of political values.
Political opinions are expressions of values. Values are not epistemically correct or wrong—that’s a category error.
Talking about “neoreaction” (or any other political group) already is a package-deal fallacy. NRs have a set of beliefs. Each of those beliefs individually can be true or false (or disconnected from evidence). These beliefs should be debated individually. It is quite possible that within the set, some beliefs will be true, some will be false, and some will be undefined. Then we can accept the true beliefs, and reject the false beliefs. There is no need to use the word “neoreaction” anywhere in that process.
So, instead of having threads about neoreaction, we (assuming we are going to debate politics) should have threads about each individual belief (only one such thread at a time). Then we should provide evidence for the belief or against the belief. Then we should judge the evidence, and come to a conclusion, unconstrained by identity labels.
The fact that we are not already doing it this way is, for me, evidence on the meta level that we are not ready to have political debates.
Debating beliefs separately, understanding the conjunction fallacy, providing evidence, avoiding labels, tabooing words… this is all rationality 101 stuff. This is “the path” we have already strayed from. If we collectively fail at rationality 101, I don’t trust our ability to debate more complex things.
A value is “I don’t want children to starve”. A political opinion is “we should increase the minimum wage (so the children will not starve)”. There is more than the value; there is also the model of the world saying that “increasing the minimum wage will reduce the number of starving children (without significant conflict with other values)”. Another person may share the value, but reject the model. They may instead have a model that “increasing the minimum wage increases unemployment, and thus increases the number of starving children”, and therefore have a political opinion “we should remove the minimum wage (so the children will not starve)”. Same value, different models, different political opinions.
It seems to me that people usually differ more in their models than in their values. There are probably few people who really want to optimize the world to increase the number of starving children, but there are many people with political opinions contradicting each other. (Believing too quickly that our political opponents have different values is also covered in the Sequences.)
I don’t think it’s quite that simple.
You are arguing for the atomicity of beliefs as well as their independence: you are saying they can (and should) stand and fall on their own. I think the situation is more complicated. The beliefs form a network, and accepting or rejecting a particular node sends ripples through the whole network.
Beliefs can support and reinforce each other, they can depend on one another. Some foundational beliefs are so important to the whole network that rejecting them collapses the whole thing. Consider e.g. Christianity—a particular network of beliefs. Some can stand or fall on their own—the proliferation of varieties of Christianity attests to that—but some beliefs support large sub-networks and if you tear them down, the rest falls, too. At the root, if you reject the belief in God, debating, for example, the existence of purgatory is silly.
The package-deal fallacy exists and is real, but excessive reductionism is a fallacy, too, and just as real.
Oh, I don’t trust our ability to debate complex things either. But debate them we must, because the alternative is much worse. That ability is not a binary flag, by the way.
True, and these should be separated to the extent possible.
I don’t know about that—I’d like to see more evidence. One of the problems is that people may seem to have the same values at the level of costless declarations (everyone is for motherhood and apple pie), but once the same people are forced to make costly trade-offs between things important to them, the real values come out and I am not sure that they would be as similar as they looked before.
I wish I could give this more than one upvote.
Maybe. It seems to me that there could be two systems of political ideas—call them A and B—both of which are pretty credible when taken as wholes, but for which if you take any single proposition from one and examine it in the context of the other, it’s obviously wrong.
(The same thing happens with scientific theories. Key words: “Quine-Duhem thesis”.)
On the other hand, it does also happen that basically-unrelated ideas get bundled together as part of a package deal, and in that case we probably do generally want to try to separate them. So I’m not sure what the best way is to make the tradeoff between splitting and lumping.