It seems to me that Viliam’s complaint is not that there would be more to talk about, but that more talk would be politicized.
Why is that a problem?
I don’t know for sure whether it was (I don’t think I ever paid that much attention to the politics threads) but here’s one way it could have been: suppose LW has few but very vocal neoreactionaries[1] and that most of the non-neoreactionaries are not very interested in talking about neoreaction[2]. If those few neoreactionaries arrange that every political discussion is packed with NRx stuff, then those political discussions will be annoying to everyone else because in order to read the bits they’re interested in they have to wade through lots of NRx comments (and perhaps, though here they may have only themselves to blame, lots of anti-NRx responses).
[1] I think there is some evidence that this is actually so.
[2] This seems likely to be true, but I have no evidence. (I don’t mean that most non-NRx people want never to talk about NRx; only that for most the optimal amount of NRx discussion is rather small.)
When I see a thread that I don’t want to read, I don’t. It doesn’t cause me any problem.
What about when you see a thread that you would want to read, but in which a few people obsessed with things you find uninteresting have posted hundreds of comments you don’t want to read?
Of course it doesn’t need to be neoreactionaries doing this. It could be social-justice types seizing every possible opportunity to point out heteronormative kyriarchal phallogocentric subtexts. It could be people terrified about AI risk turning every discussion of computers doing interesting things into debates about whether We Are All Doomed—or people skeptical about AI risk complaining incessantly about how LW promotes paranoia about AI risk. It could be Christians proposing Jesus as the answer to every question, or atheists leaping on every case of suffering or successful scientific explanation to remind us that it’s evidence against God. Etc., etc., etc.
Wouldn’t that be a significant opportunity to get LessWrong?
It might be. Or it might be so only in the sense that for an alcoholic, having a glass of whisky is a significant opportunity to practice the discipline of self-control. (That is: in principle it might be but in practice the outcome might be almost certain to be bad.)
suppose LW has few but very vocal neoreactionaries[1] and that most of the non-neoreactionaries are not very interested in talking about neoreaction[2].
What do you mean by that? Do you mean that they’re not interested in becoming lesswrong about the issue or that they only want to become lesswrong to the extent it doesn’t involve being similar to those weird NRx’s?
Obviously I mean neither (btw: hi, Eugine!). I mean what I say: for whatever reason they are not very interested in talking about NRx here. Possible reasons other than your maximally-uncharitable ones:
They are just not very interested in the things neoreactionaries get excited about (race, gender, political structures—though it occurs to me that LW’s small but vocal NRx contingent appears to be much more interested in race and gender than in any of the other things theoretically characteristic of NRx).
Is that the same as “not interested in becoming less wrong”? No, it’s broader and typically indicative of a different state of mind. Contrast a hyperzealously closed-minded Christian missionary, who is extremely interested in his religion and not at all interested in becoming less wrong about it, with an apathetic agnostic, who just doesn’t give a damn about religion. Neither will be very interested in a presentation of the merits of Hinduism, but their attitudes are quite different. (It’s not clear that one is better than the other.)
They have already given the matter plenty of thought and done their best to get less wrong about it. At this point they find little value in going over it again and again.
They are interested in becoming less wrong about political structures, gender, race, etc., but NRx positions on these lie outside the range they find credible.
Is that the same as “only to the extent it doesn’t involve being similar to those weird NRx’s”? No, it’s about finding the ideas implausible rather than finding the people offputting. (Though of course the two may go together. If you find people offputting you may dismiss their ideas; if you find an idea repellent or crazy you may think ill of people who hold it.)
They have observed some discussions of NRx, seen that they consistently generate much more heat than light, and decided that whatever the facts of the matter an internet debate about it is likely to do more harm than good.
They have found that they find NRx advocates consistently unpleasant, and the benefits of possibly becoming less wrong don’t (for them) outweigh the cost of having an unpleasant argument.
They have found that they find NRx opponents consistently unpleasant, and (etc.).
Interesting theories; let’s see how they square with the evidence.
•They are just not very interested in the things neoreactionaries get excited about (race, gender, political structures—though it occurs to me that LW’s small but vocal NRx contingent appears to be much more interested in race and gender than in any of the other things theoretically characteristic of NRx).
On the other hand they are interested in questions where race, gender, and political structures are relevant to the answers.
•They have already given the matter plenty of thought and done their best to get less wrong about it. At this point they find little value in going over it again and again.
•They are interested in becoming less wrong about political structures, gender, race, etc., but NRx positions on these lie outside the range they find credible.
If that were the case, one would expect them to be able to produce counterarguments to, say, the “NRx” (although it’s not unique to NRx) positions on race and gender. Instead the best they can do is link to SSC (which agrees that the NRx’s have a point in that respect), or say things that amount to saying how they don’t want to think about it.
•They have observed some discussions of NRx, seen that they consistently generate much more heat than light, and decided that whatever the facts of the matter an internet debate about it is likely to do more harm than good.
To the extent that’s true, it’s not the “NRx” people generating the heat.
•They have found that they find NRx advocates consistently unpleasant, and the benefits of possibly becoming less wrong don’t (for them) outweigh the cost of having an unpleasant argument.
•They have found that they find NRx opponents consistently unpleasant, and (etc.).
These are just rephrasings of my hypothesis that they only want to become lesswrong to the extent it doesn’t involve being similar to those weird NRx’s. Good to hear you’re willing to agree with it.
On the other hand they are interested in questions where race, gender and political structures are relevant to the answers.
Maybe, though in some cases their opinion as to that relevance may reasonably differ from yours. But that doesn’t in any way mean that they should be interested in NRx. Consider the following parallel. I am making plans concerning the next 10 years of my life—whether to take a new job, move house, get married or divorced, etc. It is highly relevant to my deliberations whether some time in the next few years a vengeful god is going to step in and put an end to the world as we know it. That doesn’t mean that I shouldn’t be annoyed when my attempts to discuss the next few years are repeatedly interrupted by people wanting to warn me about the coming apocalypse.
one would expect them to be able to produce counterarguments
Yup. But one wouldn’t necessarily expect them to do it. (If I’m talking about the likely state of the world economy 5 years from now and some guy bursts in to tell me excitedly about how Cthulhu will have risen from the depths by then and started eating everyone, I am not going to waste my time telling him exactly why I don’t think Cthulhu is real and why I wouldn’t expect him to start eating people so soon even if he were.)
To the extent that’s true it’s not the “NRx” people generating the heat.
Heat arises from friction. It takes two to generate the friction. I’m not terribly interested in deciding which of the sticks getting rubbed together is responsible for the flames.
These are just rephrasing of my hypothesis
No, they’re not. Your hypothesis is that these people want to avoid becoming like the NRx people; mine is that they want to avoid having to interact with the NRx people. (There might be some overlap. If someone thinks NRx people are unpleasant, they might avoid being convinced lest they become unpleasant themselves or find themselves spending more time around unpleasant people.)
I’m not, for the avoidance of doubt, claiming that your hypotheses are never correct. Just that they’re a very long way from exhausting the possibilities for why someone might not want to engage in a lot of argument about NRx, which is one reason why it is wrong to take the general statement I made and “explain” it as the more specific one you claimed was what I actually meant.
Consider the following parallel. I am making plans concerning the next 10 years of my life—whether to take a new job, move house, get married or divorced, etc. It is highly relevant to my deliberations whether some time in the next few years a vengeful god is going to step in and put an end to the world as we know it.
This is an example of these beliefs lying outside the range they find credible, which I addressed in the next point.
Yup. But one wouldn’t necessarily expect them to do it. (If I’m talking about the likely state of the world economy 5 years from now and some guy bursts in to tell me excitedly about how Cthulhu will have risen from the depths by then and started eating everyone, I am not going to waste my time telling him exactly why I don’t think Cthulhu is real and why I wouldn’t expect him to start eating people so soon even if he were.)
The difference is that the NRx’s (or at least the HBD-people) can present arguments for their beliefs, like the fact that things like race and gender do in fact correlate with IQ, SAT scores, success in various professions, etc.
Heat arises from friction. It takes two to generate the friction. I’m not terribly interested deciding which of the sticks getting rubbed together is responsible for the flames.
You’re taking the metaphor too literally in an attempt to pretend to be wise. In this case “heat” means bad arguments or no arguments at all. One side presents arguments for its positions, the other side presents a variety of ever-shifting excuses for why the topic shouldn’t be brought up at all.
This is an example of those beliefs lying outside the range they find credible, which I addressed in the next point.
Sure. I was just making the point that you can’t get from “X could be relevant to Y, which Z finds important” to “Z should be interested in X”.
the NRx’s (or at least the HBD-people) can present arguments for their beliefs
I don’t know about actual literal Cthulhu-worshippers, if any there be, but the preachers of pending apocalypse have arguments for their beliefs too. And, again, I think you may be misunderstanding the point I was making, which is simply that you can’t get from “Z has good arguments against X” to “Z will present arguments against X whenever someone comes along proclaiming X”, and therefore you can’t get from “X came up and Z blew it off without presenting counterarguments” to “Z doesn’t have good arguments against X”.
in an attempt to
This is far from the first time that you have claimed to know my motives. I’m sorry to inform you that your track record on getting them right appears to me to be very poor.
In this case “heat” means [...]
It was I, not you, who made the more-heat-than-light metaphor in this case, and you don’t get to tell me what I meant by it. I did not, in fact, mean “bad arguments or no arguments at all”; I meant “rudeness and crossness and people getting upset at one another”.
As for taking it too literally: no, I am observing that the metaphor happens to correspond to reality in a possibly-unexpected way. “Heat” in an argument really does come from “friction” between people, from them “rubbing one another up the wrong way”.
(Incidentally, it feels very odd to be criticized for doing that by an admirer of Chesterton, who did the same thing all the time. More stylishly than me, no doubt, but if writing as well as Chesterton were a requirement for participation here it would be a quiet place indeed.)
why the topic shouldn’t be brought up at all
I think the problem many people have isn’t that it’s “brought up at all” but that some of those who want to talk about NRx and HBD seem to want to talk about those things all the time. That may mean that the only actually-achievable options are (1) a strict “no talking about this stuff” policy and (2) every discussion to which race, gender, drawbacks of democracy, etc., could possibly be relevant being full of (neo-)reactionary stuff. Those both seem like bad outcomes, and if we end up with bad outcome #1 I wouldn’t want to blame whoever chooses #1 over #2 for its badness, because #2 is bad too.
It was I, not you, who made the more-heat-than-light metaphor in this case, and you don’t get to tell me what I meant by it.
Yes, I have a habit of assuming the most sensible interpretation of what my interlocutor says; it appears to be a bad habit with some people.
I meant “rudeness and crossness and people getting upset at one another”.
Ok, plugging that definition into your argument and removing the metaphor, your argument appears to come down to “arguing ‘NRx-type’ positions makes my side upset, therefore the ‘NRx’ side should stop doing it”.
That is pretty much the reverse of what you have been doing.
I think your actual habit is of assuming the interpretation that makes most sense to you. Unfortunately that isn’t the same, and in particular it gives very wrong results when your mental model of your interlocutors is very inaccurate.
your argument appears to come down to “arguing ‘NRx-type’ positions gets makes my side upset therefore the ‘NRx’ side should stop doing it”.
Not quite. (Though, as entirelyuseless says, that wouldn’t in fact be such a bad argument.) Here’s a link to where I came in; as you can see, I was explaining how it could be a problem if NRx discussions tend to proliferate. My answer was that I didn’t know whether it actually is, but it could be so in a situation where (1) there are very few NRx’s (but vocal enough to have a lot of impact) and (2) most of the other people aren’t interested in NRx discussions. And then we got into a lengthy discussion of why #2 might be; rudeness-and-crossness was one of many possibilities.
So the argument is: in this hypothetical situation that may or may not be actual, most LWers don’t want to have a lot of NRx discussions. One of the many possible reasons is (as you put it) that these arguments get their side upset. Since (in this hypothetical situation) most LWers don’t want these discussions, and very few actively do want them, LWers as a whole would be happier without them.
(Although I’ve adopted your spin-laden language in the paragraph above, I would like to point out that it’s actually quite far from what I meant. My hypothetical person-who-doesn’t-want-to-talk-about-NRx is concerned not only that his allies might get upset, but also that his opponents might; and that the result of all this getting-upset on both sides is likely to be that no one learns much from anyone else. That’s why the metaphor is “more heat than light” and not just “lots of heat”.)
Assuming that was his argument, it seems like a pretty good one. You do not persuade people by making them upset. You make them more convinced than ever of their original position.
So, being “less wrong” is measured by “how much time one spends debating neoreaction”? If you refuse to keep endlessly debating neoreaction, you are closed-minded. Don’t worry about evidence; the signalling is cool!