That’s a warning sign, not a barbed-wire fence patrolled by guards with orders to shoot to kill.
why not choose any other fringe political belief instead, or try creating a new one from scratch, or whatever?
Neoreaction is an interesting line of thought offering unusual—and so valuable—insights. If you don’t want to talk about NRx, well, don’t. If you want to talk about different political beliefs, well, do.
some way of warning people “you have strayed from the path”
What is “the path”? LW is a diverse community and that’s one of its strengths.
When a rationalist sympathetic to neoreaction reads the SSC neoreaction anti-faq, they should be deeply shocked and start questioning their own sanity. They should realize how much they have failed the art of rationality by not realizing most of that on their own.
You did mention mindkill, didn’t you? I recommend a look in the mirror. In particular, you seem to be confusing rationality with a particular set of political values.
epistemically correct political opinions
Political opinions are expressions of values. Values are not epistemically correct or wrong—that’s a category error.
Talking about “neoreaction” (or any other political group) is already a package-deal fallacy. NRs have a set of beliefs. Each of those beliefs individually can be true or false (or disconnected from evidence). These beliefs should be debated individually. It is quite possible that within the set, some beliefs will be true, some will be false, and some will be undefined. Then we can accept the true beliefs and reject the false beliefs. There is no need to use the word “neoreaction” anywhere in that process.
So, instead of having threads about neoreaction, we (assuming we are going to debate politics) should have threads about each individual belief (only one such thread at a time). Then we should provide evidence for or against the belief, judge that evidence, and come to a conclusion unconstrained by identity labels.
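As a toy illustration of what this could look like, here is a minimal sketch in Python; the claims, priors, likelihood ratios, and accept/reject thresholds are placeholders I invented, not a summary of anyone’s actual position. The point is only that the group label never appears anywhere in the computation.

```python
# Minimal sketch: score each claim separately with odds-form Bayesian updates.
# All claims, priors, and likelihood ratios below are invented placeholders.
claims = {
    "claim A": {"prior_odds": 1.0, "likelihood_ratios": [5.0, 3.0]},
    "claim B": {"prior_odds": 1.0, "likelihood_ratios": [0.2, 0.4]},
    "claim C": {"prior_odds": 1.0, "likelihood_ratios": [1.1, 0.9]},
}

for name, c in claims.items():
    # Posterior odds = prior odds times the product of the evidence likelihood ratios.
    odds = c["prior_odds"]
    for lr in c["likelihood_ratios"]:
        odds *= lr
    p = odds / (1.0 + odds)
    verdict = "accept" if p > 0.9 else ("reject" if p < 0.1 else "undecided")
    print(f"{name}: P = {p:.2f} -> {verdict}")
```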
The fact that we are not already doing it this way is, for me, evidence at the meta level that we are not ready to have political debates.
Debating beliefs separately, understanding the conjunction fallacy, providing evidence, avoiding labels, tabooing words… this is all rationality 101 stuff. This is “the path” we have already strayed from. If we collectively fail at rationality 101, I don’t trust our ability to debate more complex things.
Political opinions are expressions of values. Values are not epistemically correct or wrong—that’s a category error.
A value is “I don’t want children to starve”. A political opinion is “we should increase the minimum wage (so that children will not starve)”. There is more than the value; there is also the model of the world saying that “increasing the minimum wage will reduce the number of starving children (without significantly conflicting with other values)”. Another person may share the value but reject the model. They may instead have a model that “increasing the minimum wage increases unemployment, and thus increases the number of starving children”, and therefore hold the political opinion “we should remove the minimum wage (so that children will not starve)”. Same value, different models, different political opinions.
It seems to me that people usually differ more in their models than in their values. There are probably few people who really want to optimize the world to increase the number of starving children, yet there are many people with political opinions that contradict each other. (The mistake of believing too quickly that our political opponents have different values is also covered in the Sequences.)
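If it helps, the minimum-wage example can be restated as a toy sketch: both people below hold the same value, and only the world model (and therefore the resulting opinion) differs. The names and model strings are illustrative only.

```python
from dataclasses import dataclass

@dataclass
class Stance:
    value: str   # what the person wants
    model: str   # what the person believes the policy actually does

    def opinion(self) -> str:
        # The opinion follows from the value *through* the model.
        if self.model == "raising the minimum wage reduces child poverty":
            return "increase the minimum wage"
        return "remove the minimum wage"

alice = Stance("no starving children", "raising the minimum wage reduces child poverty")
bob = Stance("no starving children", "raising the minimum wage increases unemployment")

assert alice.value == bob.value             # same value
print(alice.opinion(), "/", bob.opinion())  # different models, different opinions
```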
Each of those beliefs individually can be true or false (or disconnected from evidence). These beliefs should be debated individually.
I don’t think it’s quite that simple.
You are arguing for atomicity of beliefs as well as their independence—you are saying they can (and should) stand and fall on their own. I think the situation is more complicated—the beliefs form a network and accepting or rejecting a particular node sends ripples through the whole network.
Beliefs can support and reinforce each other, they can depend on one another. Some foundational beliefs are so important to the whole network that rejecting them collapses the whole thing. Consider e.g. Christianity—a particular network of beliefs. Some can stand or fall on their own—the proliferation of varieties of Christianity attests to that—but some beliefs support large sub-networks and if you tear them down, the rest falls, too. At the root, if you reject the belief in God, debating, for example, the existence of purgatory is silly.
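To make the “sends ripples through the whole network” picture concrete, here is a minimal sketch under the simplifying assumption that we only record which beliefs a given belief directly rests on; the example nodes are illustrative.

```python
# Toy belief network: each belief lists the beliefs it directly rests on.
supports = {
    "God exists": [],
    "souls persist after death": ["God exists"],
    "purgatory exists": ["God exists", "souls persist after death"],
    "charity is good": [],  # stands on its own
}

def made_moot_by(rejected):
    """Return beliefs whose support collapses once `rejected` is rejected."""
    moot = set(rejected)
    changed = True
    while changed:
        changed = False
        for belief, deps in supports.items():
            if belief not in moot and any(d in moot for d in deps):
                moot.add(belief)
                changed = True
    return moot - set(rejected)

print(made_moot_by({"God exists"}))
# e.g. {'souls persist after death', 'purgatory exists'}; 'charity is good' is unaffected
```

Real belief networks also involve probabilistic and mutual support rather than hard prerequisites, but even this crude version shows why debating one node in isolation can miss the structure.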
The package-deal fallacy exists and is real, but excessive reductionism is a fallacy, too, and just as real.
If we collectively fail at rationality 101, I don’t trust our ability to debate more complex things.
Oh, I don’t trust our ability to debate complex things. But debate them we must, because the alternative is much worse. That ability is not a binary flag, by the way.
There is more than the value; there is also the model of the world
True, and these should be separated to the extent possible.
It seems to me that people usually differ more in their models than in their values.
I don’t know about that—I’d like to see more evidence. One of the problems is that people may seem to have the same values at the level of costless declarations (everyone is for motherhood and apple pie), but once the same people are forced to make costly trade-offs between things important to them, the real values come out and I am not sure that they would be as similar as they looked before.
[P]eople may seem to have the same values at the level of costless declarations (everyone is for motherhood and apple pie), but once the same people are forced to make costly trade-offs between things important to them, the real values come out and I am not sure that they would be as similar as they looked before.
Maybe. It seems to me that there could be two systems of political ideas—call them A and B—both of which are pretty credible when taken as wholes, but such that if you take any single proposition from one and examine it in the context of the other, it looks obviously wrong.
(The same thing happens with scientific theories. Key words: “Quine-Duhem thesis”.)
On the other hand, it does also happen that basically-unrelated ideas get bundled together as part of a package deal, and in that case we probably do generally want to try to separate them. So I’m not sure what the best way is to make the tradeoff between splitting and lumping.