Whether or not the lawful-goods of the world like Yvain are right, they are common. There are tons of people who want to side with good causes, but who are repulsed by the dark side even when it is used in favor of those causes. Maybe they aren't playing to win, but you don't play to win by saying you hate them for following their lawful code.
For many people, the lawful code of "I'm siding with the truth" comes before the good code of "I'm going to press whatever issue." When these people see a movement playing dirty (treating arguments as soldiers, deciding whether to challenge an argument based on which side it serves rather than whether it's a good argument, getting mad at people who point out bad arguments from their own side), they begin to suspect that your side is not the "Side of Truth". So you lose potential recruits. And with the lawful-goods, their annoying code, and the social standards they impose gone, the real Sith lords, not the ones who are trying to use the dark side for good, will have much less trouble hijacking your movement.
Leaving aside the honor-among-foes idea and the "what if you're really the villain" idea: if your cause is really just, then although the lawful-goods are less effective than you, their existence is good for you. Not everything they do is good, but on balance they are a positive influence. You're not going to convince them to attempt to be dark side users for good like you are attempting to be, so stop giving them reasons to dislike you.
Even if you can convince them, the lawful-evils who think they are lawful-goods are listening to your arguments. Most people think they are good, and it is hard to tell when you're not. So the idea that only truly good people are bound by the lawful code is crazy. Lots of lawful evil is an unintentional corruption of lawful good, and this corruption need not strip away your lawfulness along with your goodness. If they didn't follow the lawful code, they could tell (or at least convince themselves) that they weren't really good, because they think like lawful good people in that respect. The lawful evil people who see you, and who know you are opposed to them on the good/evil axis, think they see evil people saying "Forget this honor among enemies thing. We have no honor. Watch me put on this 'I am defectbot' shirt". And that is a much stronger argument for abandoning the lawful code of rational argument and becoming the much more dangerous chaotic evil than what the lawful-goods hear, which is their chaotic good allies telling them to defect.
But in real modern human politics, it's more complicated: although there is one lawful/chaotic axis, there are many good/evil axes, because there are many separate issues that people can get right or wrong. Arthur Chu thinks that the issue of overriding importance is social justice, so he demands that we drop all cooperation with people who are evil on that axis, and he says we aren't playing to win. I can think of three issues (two of them are actually broad categories of issues) that I am confident are more important than social justice, and which are easier to improve than the problems social justice wants to counter. In order of decreasing importance: existential risk; near-term animal suffering, including factory farming and wild animals; and mortality/aging.
In real life, you don't demand that your allies be on the same end of every good/evil axis as you; that is not playing to win. A better strategy (and the one Chu is employing) is to pick the most important axis and try to form a coalition based on that axis. Chu accuses LW of not playing to win; well, I'm just not playing to win along the social justice axis at the cost of everything else. I think different axes are more important.
And there’s also the fact that for some causes, “lawful” people (people who play by the rules of rational discourse) are much better to have as allies. If we use bad statistics and dark arts to convince the masses to fund FAI research, they may as well fund Novamente as MIRI. Not all causes can benefit from irrational masses. Something like MIRI can’t afford to even take one step down the path to the dark side. When you want to convince academics and experts of your cause, they will smell the dark arts on you and conclude you are a cult. And with the people you will attract by using dark arts, your organization will soon become one. The kind of people who you absolutely need to do object-level work for you are the kind of people who will never join you if you use the dark arts.
If you take a pluralistic "which axes are important" approach instead of the one that Chu takes, then there is a lot to be said for lawfulness, because it tends to promote goodness*, a little. And when you get a bunch of lawful-goods and lawful-evils together and you nudge them all a little toward good through rational discussion (on different axes), that is pretty valuable, because almost everyone is evil on at least one axis. And such a community needs a policy like "we ask that you be lawful [follow standards of rational discourse], not that you be good [have gotten object-level questions of policy right]", because it is the only defensible Schelling point.
*If you haven't caught on to how I'm using the "law vs chaos" and "good vs evil" axes here by now, this may sound like moral realism, but what I mean by "law" is upholding Yvain-style standards of discourse. What I mean by "good" is not just being moral, but being moral and, given that morality, getting the object-level questions of ethics right.
According to the vast majority of social justice types, you have just signaled yourself as a quite serious enemy. At the very least you would get banned from any site you posted this on. At worst you would probably be blackballed and quite roundly thrashed on social media.
Given your argument about how they should interact with "lawful goods" (and please taboo your applause lights of "dark arts" and D&D alignments in general), you are unironically making a much larger mistake with respect to them. That is, this post will put them off you, and possibly off people who interact positively with you, far more than Arthur's posts would put Yvain off.
Can you explain the difference here? Why is it rational for you to make this post but not for Arthur to make his?
For reference, saying that existential risk and animal cruelty outweigh social justice is going to be extremely offensive to them. I'm not sure I could state how much mortality/aging being more important than social justice, especially in LW terms, would make them hate you without getting banned, even on a site that prides itself on free and open discourse. Well, I suppose I could try, but I would probably fail.
I wouldn’t react the same way, but I also wouldn’t fault them for their reaction.
Is this supposed to be an argument against Mestroyer or against the Social Justice Types?
It's an inconsistency in Mestroyer's logic. He talks about how social justice types could attract "lawful good" people. But his response, typical of a certain class of rationalist, is so offensive to social justice types that one wonders whether it makes sense to attract a group that places social justice so low on its to-do list. It seems to me that it would be counterproductive to integrate people such as Mestroyer into the social justice community.

This is a problem with the social justice types being too mindkilled, not a problem with Mestroyer's logic.
Disagree. Social justice is a set of dozens of axes and deals with issues like the prison-industrial complex. But somehow people caught up in that are being unreasonable when Mestroyer says that existential risk is more important and they take offense? That’s ridiculous.
The net harm done by any number of social justice issues far outweighs the harm from the issues Mestroyer considers important, based on his comment.
Are you arguing that it's merely the intensity of the response that makes them mindkilled?
The intensity of the response is what makes them mindkilled. Of course, if the social justice people were actually willing to listen to people who disagreed with them, they might realize that Mestroyer is in fact correct about existential risk being more important than their issues.
Edit: In fact, if they listened to more criticism, they might realize that the net harm from most of their issues is at worst negligible and at best negative, i.e., that it is the social justice movement itself that is doing net harm.

No it's not?
It might be more important to white upper-middle-class rationalists, especially somewhere like the Bay Area. Can't argue with that.
You’d be hard pressed to convince me that cryonics is more beneficial to people of color than dismantling the systematic bias against people of color inherent in western society. Most existential risks are similarly unconvincing.
You’d be hard pressed to convince me that cryonics is more beneficial to people of color than dismantling the systematic bias against people of color inherent in western society.
Well, there is a very simple reason cryonics is more beneficial than "dismantling the systematic bias against people of color inherent in western society", namely, that the "systematic bias against people of color inherent in western society" doesn't actually exist. If anything, modern western society has a systematic bias in favor of people of color.
Here’s a hint: the people who told you that there exists “a systematic bias against people of color inherent in western society” believe that lying is justified for the cause, and they were either lying to you or repeating someone else’s lie.
You’re really serious aren’t you? Affirmative action? That’s your argument? And I thought this was a rationalist website. You probably think there isn’t a systematic bias against women either.
I’m tapping out, now that you’ve revealed your true nature.
You’re really serious aren’t you? Affirmative action? That’s your argument?
Yes, and I notice a distinct lack of counter-argument on your part.
And I thought this was a rationalist website.
Yes, and that means we are expected to provide arguments for our claims here.
You probably think there isn’t a systematic bias against women either.
There isn’t.
now that you’ve revealed your true nature.
From my experience, that’s social-justice-speak for “I don’t actually have any rational arguments against your position so I’m going to resort to name calling”.
Could you post a screenshot or archived version of your Facebook link?