1) I think your reaction to this situation owed more to your own psychological peculiarities (whatever they are) than to some characteristic shared by everyone who identifies as a rationalist. There’s no reason to expect people who hold the same beliefs as you to always keep their cool (at least not on the first try) when talking to someone with an obviously incompatible belief system.
2)
It was extremely difficult to construct an argument against, because all of my arguments had logically consistent bases, and were thus invalid in its face.
It doesn’t have to be like that, at least if you don’t start off with a consistent but false belief system. The way I think about such issues while effectively avoiding epistemological crises is the following: any algorithm by which I arrive at conclusions I can pretty confidently dub “knowledge” ends up being added to my cognitive toolbox. There are many things out there that look like they were designed to be added to people’s cognitive toolboxes, but not all of them can be useful, can they? Some of them look like they were specifically designed to smash other such tools to pieces. So here’s a good rule of thumb: don’t add anything to your cognitive toolbox that looks like an “anti-tool” to a tool that is already inside of it. Anything that you suspect makes you know less, makes you dumber, or requires you to forsake trustworthy tools is safe and advisable to ignore. (In keeping with the social justice topic, one subcategory of beliefs that are bad to incorporate is those that cause you to succumb to, rather than resist, what you know to be flaws in your cognitive hardware, such as ingroup-outgroup bias or the affect heuristic—that’s why, I think, one should avoid getting too deep into the “privilege” crowd of social justice even if the arguments make sense to one.) Of course, you should periodically empty out the toolbox and see whether the tools are in good shape, or whether there’s an upgraded version available, or whether you were simply using the wrong hammer all along—but generally, rely on them.
3) You like to explore the implications of a premise, which is completely incompatible with your friend’s “separate magisteria” approach (a technique straight out of the Official Handbook of Belief Conservation); unfortunately, that is why you weren’t able to abandon the train of thought before it derailed into emotional disturbance. You see someone saying you shouldn’t use an obviously (to you) useful and relevant method for investigating something? That’s a sign that says: “Stop right here. There’s no use in trying to extrapolate the consequences of this belief of theirs; they obviously haven’t thought about it in sufficient detail to form opinions on it that you can make heads or tails of.” The knowledge and depth of thought it takes to see why math is relevant to understanding society is small enough that, if they failed to catch even that, they obviously went no further in establishing beliefs about math that could be either consistent or inconsistent with the pursuit of justice and equality. You went as far as seeing the implications and being horrified—“How can anyone even think that?”—but it is a thought they likely never got to think; the ramifications of their thought about math ended long before that, presumably at the point where it began to interfere with ideological belief conservation.
4) Get better friends. I know the type, and I’ve learned the hard way not to try to reason with them. Remember that one about playing chess with a pigeon?
So here’s a good rule of thumb: don’t add anything to your cognitive toolbox that looks like an “anti-tool” to a tool that is already inside of it. Anything that you suspect makes you know less, makes you dumber, or requires you to forsake trustworthy tools is safe and advisable to ignore. (In keeping with the social justice topic, one subcategory of beliefs that are bad to incorporate is those that cause you to succumb to, rather than resist, what you know to be flaws in your cognitive hardware, such as ingroup-outgroup bias or the affect heuristic—that’s why, I think, one should avoid getting too deep into the “privilege” crowd of social justice even if the arguments make sense to one.)
Why is privilege such a dangerous idea? I suspect that your answer is along the lines of “A main tenet of privilege theory is that privileged people do not understand how society really works (they don’t experience discrimination, etc.), therefore it can make you despair of ever figuring anything out, and this is harmful.” But reading about cognitive biases can have a similar effect. Why is learning about bias due to privilege especially harmful to your cognitive toolbox?
No, it’s not that. It’s that there are many bugs of the human mind that identity politics inadvertently exploits. For one, it provides convenient ingroups and outgroups for people to feel good and bad about, respectively—the privileged and the oppressed—and these outgroups are based on innate characteristics. Being non-white, female, gay, etc. wins you points with the social justice crowd just as being white, male, straight, etc. loses you points. Socially speaking, how much a “social justice warrior” likes you is partly a function of how many disadvantaged groups you belong to. This shouldn’t happen, perhaps not even in accordance with the more academic, theoretical side of social justice, but it does, because we’re running on corrupted hardware and these theories fail to compensate for it.
Another closely related problem is “collecting injustices”. Everything bad that happens to you that you attribute to your membership in an oppressed group can be turned into debate ammunition against the other side; you can point to it to cast yourself in a positive, sympathetic, morally superior light and your opponents in a negative one. So there’s a powerful rhetorical upside to being in what is otherwise simply a very shitty situation. On some level, this incentivizes people not to actually seek to minimize such situations. Of course, people hate oppression and don’t honestly want to experience it; yet winning debates automatically and gaining the right to pontificate feels good. So what to do? Lower the threshold for what counts as oppression, obviously. This has absolutely disastrous effects on their stated goals: if anything whatsoever incentivizes you to find more oppression in the world around you, you can’t sincerely pursue the goal of ending it.
Also, some of the local memes instruct people to lift all the responsibility for a civilized discussion off themselves and put it on the other side. Yvain had a post on his LJ describing this mode of discussion as a “superweapon”. Also, see this page (a favorite of the internet social justice advocates I’ve had the displeasure of running into) for a good idea of the debate rights many of them claim and the many responsibilities of which they absolve themselves. If that doesn’t look like mindkilling, I don’t know what does.
Simply put, many people like this ideology because it gives them an opportunity to revel in their self-righteousness. Of course, it’s good for people to know whether they have their cultural blinders on in specific situations; it’s also very bad for people to vilify an entire race or sex or whatever. The tricky thing to do is to clear your mind of your identity-induced biases without adopting an ideology that, overall, has a great chance of making you more irrational than before.