So you’re claiming that religion (aka team green) is so bad and irrational that any analogy of rationalism (aka team blue) with it is dangerous and sabotage? Or that any positive talk of team green is a threat?
It seems to me that the LW (over)reaction to the irrationality of religion is pretty irrational and has nothing to do with ‘clarity’. If you’re rejecting a priori a line of inquiry because it seems threatening to the in-group, I don’t consider that “rational.”
Edit: This was an overly antagonistic response, in part due to an uncharitable reading of Vladimir_Nesov’s comment.
No, he is making a different and more precise claim.
There is a phenomenon that can be called “anti-epistemology.” This is a set of social forces that penalize or otherwise impede clear thought and speech.
Sometimes, a certain topic in a certain space is free of anti-epistemology. It is relatively easy to think about, research, and discuss it clearly. A central example would be the subject of linear algebra in the context of a class on linear algebra.
Other times, anti-epistemology makes thought, research, and discussion difficult for a certain topic in a certain context. A central example here would be the topic of market economics in a Maoist prison camp.
What unites both of these cases is how overt they are. The linear algebra class is set up with the explicit goal of supporting the study of linear algebra; the Maoist prison camp with the explicit purpose of suppressing market economics (among other ideas).
Less crushing than the Maoist prison camp, but still pernicious, are settings in which a certain topic is suppressed implicitly, by a semi-spoken or unspoken set of social monitoring, implicit loyalty tests, and softly enforced or reflexive analogies and framings. This is an anti-epistemology that we might encounter in our everyday lives. You may happen to be so blessed as to be entirely free of such social dynamics, so incredibly brave as to be unaffected by them, or even blissfully unaware that such pressures could exist. But for others, this is a routine struggle.
The claim that Vladimir Nesov is making is that the way such “soft suppressions” get set up in the first place is by establishing analogies. To that, I would add framings and choices of words. For example, if you wish to undermine support for market economics, start by referring to it as “capitalism.” If you wish to undermine support for new development, refer to it as “gentrification.” If you wish to undermine support for social justice, refer to it as “cancel culture.” If you’re an American who wishes to undermine support for the 2nd amendment, refer to action movies as “shoot ’em up movies.”
Then you can start with the analogies. In your case, if you wish to undermine rationality, you might start by making an analogy with religion. It’s a very common reflex. Social justice, capitalism, gun rights, sexual promiscuity, free speech, nationalism, and more have all been referred to many, many times as “religions” by their political opponents in one editorial after another.
Analogies aren’t inherently bad. They can be useful to pick out a particular feature of a confusing thing and make it familiar to a novice by comparison with a familiar thing. I am a biomedical engineering student. If I wanted to explain what a hydrogel is to you, I might say that it’s like snot. That’s an analogy, and it’s not anti-epistemology. I have a particular thing I want you to understand, and I choose a familiar example to help you get there.
But this has many properties in common with cherry-picking. You’re selecting a particular feature, taken out of context, and focusing attention on it. You’re asking for trust, putting yourself in a teaching role, conveying a picture of something with which your audience is unfamiliar, and planting a memory and association in their mind. For this reason, an emotionally loaded but fundamentally lazy, misleading, narrow, or otherwise flawed analogy can be an effective way to undermine support for a thing.
You are quite new to LessWrong, and will have to make up your own mind about the ideas you find here. Right now, you seem to be in a tense place—both viewing the site and its participants as irrationally religious in their denunciation of religion, and yet simultaneously feeling motivated to engage with it anyway.
My suggestion to you is to assess your own motives. Are you interested in what you find here, and do you think there’s a good chance that you are the one with something to learn? If so, then consider that when you find yourself tempted to reflexively dismiss or unfavorably analogize what you find here.
If not, then consider simply stopping your participation. I say this not to be rude, but out of compassion. There is so much wrong stuff on the internet that if you waste your time fighting it, you’ll never discover what’s right.
Thank you. I really appreciate this clarification.
I meant God Is Great as a strong endorsement of LessWrong. I am aware that establishing an analogy with religion is often used to discredit ideas and movements, but one of the things I want to push back against is the idea that this move is necessarily discrediting. Explaining why I think so requires a lot of work on my part (historical background on how religions came to occupy the place they do within culture today, classical liberal political philosophy...), and why, in the case of EA/LW, I think the comparison is flattering and useful. This is work I haven’t done yet, and I might be wrong about how I view this, so I guess I shouldn’t have been too surprised by the negative reaction.
I really should have written and posted something about my heterodox background assumptions first, and gotten feedback on them, before I published something building on them.
Framing, research, and communication are all skills that take practice! I hope you’ll ultimately find this a helpful space to build your skills :)