That line was somewhat tongue-in-cheek. I wouldn’t go that far over the top in a real discussion, although I might throw in a bit of anti-*ist rhetoric as an expected shibboleth.
That being said, these people aren’t stupid. They don’t generally have the same priorities or epistemology that we do, and they’re very political, but that’s true of a lot of people outside the gates of our incestuous little nerd-ghetto. Winning, in the real world, implies dealing with these people, and that’s likely to go a lot better if we understand them.
Does that mean we should go out and pick fights with mainstream social justice advocates? No, of course not. But putting ourselves in their shoes every now and then can’t hurt.
This makes some sense. I think part of the reason my contribution was taken so badly was, as I said, that I was arguing in a style that was clearly different to that of the rest of those present, and as such I was (in Villam Bur’s phrasing) pattern-matched as a bad guy. (In other words, I didn’t use the shibboleths.)
Significantly, no-one seemed to take issue with the actual thrust of my point.
“These people” are not homogeneous and there are a lot of idiots among them. However, what most of them are is mindkilled. They won’t update, so why bother?
> However what most of them are is mindkilled. They won’t update so why bother?
Because we occasionally might want to convince them of things, and we can’t do that without understanding what they want to see in an argument. Or, more generally, because it behooves us to get better at modeling people that don’t share our epistemology or our (at least, my) contempt for politics.
> Because we occasionally might want to convince them of things, and we can’t do that without understanding what they want to see in an argument.
So, um, if you really let Jesus into your heart and accept Him as your personal savior, you will see that He wants you to donate 50% of your salary to GiveWell’s top charities…?
> it behooves us to get better at modeling people that don’t share our epistemology or our (at least, my) contempt for politics.
True, but you don’t do that by mimicking their rhetoric.
> True, but you don’t do that by mimicking their rhetoric.
The point isn’t to blindly mimic their rhetoric, it’s to talk their language: not just the soundbites, but the motivations under them. To use your example, talking about letting Jesus into your heart isn’t going to convince anyone to donate a large chunk of their salary to GiveWell’s top charities. There’s a Christian argument for charity already, though, and talking effective altruism in those terms might well convince someone who accepts it to donate to real charity rather than some godawful sad puppies fund; or to support or create Christian charities that use EA methodology, which given comparative advantage might be even better. But you’re not going to get there without understanding what makes Christian charity tick, and it’s not the simple utilitarian arguments that we’re used to in an EA context.
> The point isn’t to mimic their rhetoric, it’s to talk their language
There is a price: to talk in their language is to accept their framework. If you are making an argument in terms of fighting the oppression of white male patriarchy, you implicitly agree that the white male patriarchy is in the business of oppression and needs to be fought. If you’re using the Christian argument for charity to talk effective altruism, you are implicitly accepting the authority of Jesus.
> If you’re using the Christian argument for charity to talk effective altruism, you are implicitly accepting the authority of Jesus.
Yes, you are. That’s a price you need to pay if you want to get something out of mindkilled people, which incidentally tends to be the first step in introducing outside ideas and thereby making them less mindkilled. Reject it in favor of some kind of radical honesty policy, and unless you’re very lucky and very charismatic you’ll find yourself with no allies and few friends. But hey, you’ll have the moral high ground! I hear that and $1.50 will get you a cup of coffee.
(My argument in the ancestor wasn’t really about fighting the white male patriarchy, though; the rhetoric about that is just gingerbread, like appending “peace be upon him” to the name of the Prophet. It’s about the importance of subjective experience and a more general contrarianism—which are also SJ themes, just less obvious ones.)
> That’s a price you need to pay if you want to get something out of mindkilled people, which incidentally tends to be the first step in making them less mindkilled.
Maybe it’s the price you need to pay, but I don’t see how being able to get something out of mindkilled people is the first step in making them less mindkilled. You got what you wanted and paid for it by reinforcing their beliefs—why would they become more likely to change them?
> some kind of radical honesty policy
I am not going for radical honesty. What I’m suspicious of is using arguments which you yourself believe are bullshit and at the same time pretending to be a bona fide member of a tribe to which you don’t belong.
And, by the way, there seems to be a difference between Jesus and SJ here. When talking to a Christian I can be “radically honest” and say something along the lines of “I myself am not a Christian, but you are, and don’t you recall how Jesus said that …”. But that doesn’t work with SJWs—if I start by saying “I myself don’t believe in white male oppression, but you do, and therefore you should conclude that...”, I will be immediately crucified for the first part and no one will pay any attention to the second.
I don’t see how being able to get something out of mindkilled people is the first step in making them less mindkilled. You got what you wanted and paid for it by reinforcing their beliefs—why would they become more likely to change them?
You’re not substantially reinforcing their beliefs. Beliefs entangled with your identity don’t follow Bayesian rules: directly showing anything less than overpoweringly strong evidence against them (and even that isn’t a sure thing) tends to reinforce them by provoking rationalization, while accepting them is noise. If you don’t like Christianity, you wouldn’t want to use the Christian argument for charity with a weak or undecided Christian; but they aren’t going to be mindkilled in this regard, so it wouldn’t make a good argument anyway.
On the other hand, sneaking new ideas into someone’s internal memetic ecosystem tends to put stress on any totalizing identities they’ve adopted. For example, you might have to invoke God’s commandment to love thy neighbor as thyself to get a fundamentalist Christian to buy EA in the first place; but now they have an interest in EA, which could (e.g.) lead them to EA forums sharing secular humanist assumptions. Before, they’d have dismissed this as (e.g.) some kind of pathetic atheist attempt at constructing a morality in the absence of God. But now they have a shared assumption, a point of commonality. That’ll lead to cognitive dissonance, but only in the long run—timescales you can’t work on unless you’re very good friends with this person.
That cognitive dissonance won’t always resolve against Christianity, but sometimes it will. And when it doesn’t, you’ll usually still have left them with a more nuanced and less stereotypical Christianity.
> You’re not substantially reinforcing their beliefs.
Well, yes, if we’re talking about a single conversation, especially over the ’net, you are not going to affect much of anything. Still, even if you do not reinforce, you confirm. And there are different ways to get mindkilled; entangling your identity with beliefs is only one of them...
> On the other hand, sneaking new ideas into someone’s internal memetic ecosystem tends to put stress on any totalizing identities they’ve adopted.
True, but the same caveat applies—if we’re talking about one or two conversations, you’re not going to produce much, if any, effect.
In any case, my line of thinking in this subthread wasn’t concerned so much with the effectiveness of deconversion, but rather was more about the willingness to employ arguments that you don’t believe but your discussion opponent might. I understand the need to talk to people in the language they understand, but there is a fine line to walk here.
I don’t think reinforcing stupidity is a good idea.
“Never argue with stupid people, they will drag you down to their level and then beat you with experience.” ― Mark Twain
This is that level.