I’m going to make a controversial suggestion: one useful target of tolerance might be religion.
I think we pretty much all understand that the supernatural is an open and shut case. Because of this, religion is a useful example of people getting things screamingly, disastrously wrong. And so we tend to use that as a pointer to more subtle ways of being wrong, which we can learn to avoid. This is good.
However, when we speak of religion too frequently, and with too much naked disdain, these habits begin to have unintended negative effects.
It would be useful to have resources on general rationality to which to point our theist friends, in order to raise their overall level of sanity to the point where religion can fall away on its own. This is not going to work if these resources are blasting religion right from the get-go. Our friends are going to feel attacked, quickly close their browsers, and probably not be too well-disposed towards us the next time we speak (this may not be an entirely hypothetical example).
I’m not talking about respect. That would be far too much to ask. If we were to speak of religion as though it could genuinely be true, we would be spectacular liars. Still, not bringing up the topic when it’s not necessary, using another example if there happens to be one available, would, I think, significantly increase the potential audience for our writing.
The problem with tolerating religion is that, as Dawkins pointed out, it has received too much tolerance already. One reason religion is so widespread and obnoxious is that it has been so off limits to criticism for so long.
A good solution to this is to have some diversity of rhetoric. Some people can be blunt, others openly contemptuous, and others more friendly and overtly tolerant. There’s room enough for all of these.
The less tolerant people destroy the special immunity to criticism that religion has long enjoyed, and get to be seen as the “extremists”. Meanwhile they make the sweetness-and-light folks look more moderate by comparison, which is a useful thing. A lot of people reflexively reject extremism, which they define as simply the most extreme views that they’re hearing expressed on a contentious issue. Make the extremists more extreme, and more moderate versions of their viewpoint become more socially acceptable.
Someone has to play the villains in this story.
I’m very much in favor of what you wrote there. I’ve been thinking of starting a separate thread about this for some time. Feel free to beat me to it, though; I won’t be ready to do so very soon anyway. But here’s a stab at what I’m thinking.
This is from the welcome thread:
A note for theists: you will find LW overtly atheist. We are happy to have you participating, but please be aware that other commenters are likely to treat religion as an open-and-shut case. This isn’t groupthink; we really, truly have given full consideration to theistic claims and found them to be false.
This is fair. I could, in principle, sit down and discuss rationality with a group carrying such a disclaimer, only reversed in favor of religion, assuming they got promoted to my attention for some unrelated good reason (say, I was linked to an article, read it and two more, and found them all impressive). Not going to happen in practice, probably, but you get my drift.
Except that, IMO, that’s not the vibe of what Less Wrong is actually like: it doesn’t feel as though we’re “happy to have” these people. Atheism strikes me as a belief that’s necessary for acceptance to the tribe. This is not a Good Thing, for many reasons, the simplest of which is that atheism is not rationality. Reversed stupidity is not intelligence; people can be atheists for stupid reasons, too. So when people see that atheism seems to be necessary here in order to follow our arguments and see our point, they will be suspicious of those arguments and points. If you can’t make your case about something that in principle isn’t about religion without using religion in the reasoning, it’s probably not a good case.
What I’d advocate is not using religion as an example of obvious inanity in support of some other point, as in this otherwise great post:
http://lesswrong.com/lw/1j7/the_amanda_knox_test_how_an_hour_on_the_internet/
Now I’m not in favor of censoring religion out and pretending we’re not 99% atheists here or whatever the figure is. If the topic of some article is tied to religion, then sure, anything goes—you’ll need good arguments anyway or you won’t have a post and people will call you on using applause lights instead of argumentation.
But, more subtly: if the topic is some bias or rationality tool, and religion is a good example of how that bias operates/tool fails to be applied, then go ahead and show that example after the bias/tool has already been convincingly established in more neutral terms. It’s one of the reasons why we explain Bayes’ theorem in terms of mammographies, not religion.
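For concreteness, here is a minimal sketch of the mammography calculation being alluded to. The specific numbers (1% base rate, 80% sensitivity, 9.6% false-positive rate) are the usual illustrative figures, not anything given in this thread.

```python
# Bayes' theorem with the standard illustrative mammography numbers
# (assumed for this sketch; they are not specified anywhere in the thread).
p_cancer = 0.01              # prior: P(cancer)
p_pos_given_cancer = 0.80    # sensitivity: P(positive | cancer)
p_pos_given_healthy = 0.096  # false-positive rate: P(positive | no cancer)

# Total probability of a positive test, by the law of total probability.
p_positive = (p_pos_given_cancer * p_cancer
              + p_pos_given_healthy * (1 - p_cancer))

# Bayes' theorem: P(cancer | positive) = P(positive | cancer) * P(cancer) / P(positive)
p_cancer_given_positive = p_pos_given_cancer * p_cancer / p_positive

print(f"P(cancer | positive test) = {p_cancer_given_positive:.3f}")  # about 0.078
```

The payoff is the surprise: even after a positive test, the probability of cancer is only about 8%, because the condition is rare to begin with; the example teaches the update without stepping on anyone’s beliefs.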
Feedback would be welcome.
I think this is a good analysis.
However, in some areas, it is particularly difficult to keep things separate. The two cultures are simply very different; discussions have a way of finding the largest differences.
To be more specific: a recent conversation about rationalism came to the point of whether we could depend on the universe not to kill us. (To put it as it was in the conversation: there must be justice in the universe.)
Well, I think you’re absolutely right except, perhaps, regarding the claim that “Atheism strikes me as a belief that’s necessary for acceptance to the tribe.” I’m not an atheist, and while I get mobbed by people asking me to refute arguments I’ve heard a thousand times before whenever I mention this fact, I’ve never found myself, or seen others, rejected as members of the tribe for admitting to religious beliefs.
I can think of another three reasons to explain Bayes’ theorem in terms of mammograms (or “mammographies”, if you prefer): boobs, torture and the mathematical ignorance of physicians.
Tolerance is over-rated (although it’s a Masonic virtue so I’m supposed to like it): to me, the word has supercilious connotations—kind of “I’m going to permit you to persist in error, unmolested, coz I’m just that awesome”.
I prefer acceptance: after you have harangued someone with everything that’s wrong with their view of the problem, give up and accept that they’re idiots.
Firstly, that is the most blatant derailing of a thread I have ever seen.
Secondly, the main advantage of “tolerance” is that most people cannot, by definition, be in a better position to judge on certain issues than most other people—and indeed will almost certainly be wrong about at least some of their beliefs. Thus, it is irrational to impose your beliefs on others if you have no reason to think you are more rational than they are (see also Aumann’s agreement theorem). Of course, it is also irrational to believe you are right in this situation, but at least it’s not harming people.
The most extreme example of this principle would be someone programming their beliefs regarding morality directly into a Seed AI. Since they are almost certainly wrong about something, the AI will then proceed to destroy the world and tile the universe with orgasmium or whatever.
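As a toy illustration of the Aumann point above (my sketch, not the commenter’s; the coin example and all of its numbers are invented for illustration): two agents who share a common prior and pool their evidence end up with exactly the same posterior. The theorem itself concerns common knowledge of posteriors rather than literal evidence-sharing, so this only conveys the flavor of why common priors leave no room to agree to disagree.

```python
# Two agents with a common prior over a coin's bias each observe some
# private flips. Pooling the evidence gives them identical posteriors.
# (Hypotheses, priors, and flip sequences are made up for this example.)

hypotheses = {"fair": 0.5, "biased": 0.8}   # P(heads) under each hypothesis
prior = {"fair": 0.5, "biased": 0.5}        # common prior over the hypotheses

def likelihood(flips, p_heads):
    """Probability of a specific flip sequence given P(heads)."""
    result = 1.0
    for flip in flips:
        result *= p_heads if flip == "H" else (1 - p_heads)
    return result

def posterior(prior, flips):
    """Posterior over the hypotheses after observing the given flips."""
    unnorm = {h: prior[h] * likelihood(flips, p) for h, p in hypotheses.items()}
    total = sum(unnorm.values())
    return {h: v / total for h, v in unnorm.items()}

alice_flips = ["H", "H", "H", "T"]   # Alice's private evidence
bob_flips = ["H", "T", "T", "T"]     # Bob's private evidence

print(posterior(prior, alice_flips))              # they disagree at first...
print(posterior(prior, bob_flips))
print(posterior(prior, alice_flips + bob_flips))  # ...and agree once all evidence is shared
```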
What was the title of the post? Something about tolerance, if I’m not mistaken.
As to your ‘secondly’ point… I absolutely agree with the statement that “most people cannot, by definition, be in a better position to judge on certain issues than most other people” (emphasis mine—in fact I would extend that to say on most issues of more than minimal complexity).
The absolutely key point to bear in mind is that if you harangue someone about a problem when you’re not in a better position to judge on that particular issue, you’re being an asshat. That’s why I tend to limit my haranguing to matters of (deep breath)...
Economics (in which I have a double-major First, with firsts in Public Finance, Macro, Micro, Quantitative Economic Policy, International Economics, Econometric Theory and Applied Econometrics);
Econometrics (and the statistical theory underpinning it), for which I took straight Firsts at Masters; and
Quantitative analysis of economic policy (and economic modelling generally), which I did for a living for half a decade and taught to undergraduates (3rd year and Honours).
I babble with muted authority on
expectations (having published on, and having been asked to advise my nation’s Treasury on, modelling them in financial markets within macroeconometric models), and
the modelling paradigm in general (having worked for almost a decade at one of the world’s premier economic modelling think tanks, and having dabbled in a [still-incomplete] PhD in stochastic simulation using a computable general-equilibrium model).
And yet I constantly find myself being told things about economics, utility maximisation, agency problems, and so forth, by autodidacts who think persentio ergo rectum is a research methodology.
What was the title of the post? Something about tolerance, if I’m not mistaken.
So why not comment on the post, hmm?
The absolutely key point to bear in mind is that if you harangue someone about a problem when you’re not in a better position to judge on that particular issue, you’re being an asshat.
Oh, of course. If you genuinely have good reason to believe you know better than (group) beyond the evidence you have that you are right, then it is perfectly reasonable to act on it. But since most of the time you’re probably not in that position, it seems to me that cultivating tolerance is a good idea.
I’m going to make a controversial suggestion: one useful target of tolerance might be religion.
I’ll try to tolerate your tolerance.
(I blog using any examples that come to hand, but when I canonicalize I try to remove explicit mentions of religion where possible. Bear in mind that intelligent religious people with Escher-minds will see the implications early on, though.)
You canonicalize?
Where can we find your canon, and is it marked as canonical?
This might (partly) answer your question:
http://www.overcomingbias.com/2007/09/why-im-blooking.html
So he means a future canon? I can’t go somewhere today and find it?
(I disapprove of anyone calling some of their own non-fiction works ‘canonical’, but without conviction, never having thought about it before.)
The term “canonical” has a somewhat different definition in the fields of math and computer science. Eliezer is probably using it in a sense influenced by that definition: “converting his writing into canonical form”, as opposed to an ad-hoc or temporary form. In my experience, the construction “canonicalize” refers almost exclusively to this sense of the word.
See the Jargon File entry for clarification.
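For anyone who doesn’t want to chase the link, here is a quick illustration of the CS sense of the word (my example, not one from the Jargon File): canonicalizing means mapping the many equivalent forms something can take onto one standard representative, e.g. reducing fractions to lowest terms.

```python
from math import gcd

def canonicalize_fraction(num, den):
    """Map any of the equivalent forms of a fraction (2/4, -3/-6, ...)
    to a single canonical representative: lowest terms, positive denominator."""
    if den == 0:
        raise ZeroDivisionError("denominator must be nonzero")
    sign = -1 if (num < 0) != (den < 0) else 1
    num, den = abs(num), abs(den)
    g = gcd(num, den)
    return (sign * (num // g), den // g)

print(canonicalize_fraction(2, 4))    # (1, 2)
print(canonicalize_fraction(-3, -6))  # (1, 2): same canonical form
```

“Canonicalizing” a body of blog posts is the same move applied to prose: many scattered, ad-hoc versions get consolidated into one standard form.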
Sadly true.
I think you point up the problem with your own suggestion—we have to have examples of rationality failure to discuss, and if we choose an example on which we agree less (e.g. something to do with AGW), then we will end up discussing the example instead of what it is intended to illustrate. We keep coming back to religion not just because practically every failure of rationality there is has a religious example, but because it’s something we agree on.
It should be noted that if all goes according to plan, we won’t have religion as a relevant example for too much longer. One day (I hope) we will need to teach rationality without being able to gesture out the window at a group of intelligent adults who think crackers turn into human flesh on the way down their gullets.
Why not plan ahead?
ETA: Now that I think of it, crackers do, of course, turn into human flesh; it just happens a bit later.
It’s not so much that I’m trying to hide my atheism, or that I worry about offending theists—if I were, I wouldn’t speak frankly online. The smart ones are going to notice, if you talk about fake explanations, that this applies to God; and they’re going to know that you know it, and that you’re an atheist. Admittedly, they may be much less personally offended if you never spell out the application—not sure why, but that probably is how it works.
And I don’t plan far enough ahead for a day when religion is dead, because most of my utility-leverage comes before then.
But rationality is itself, not atheism or a-anything; and therefore, for aesthetic reasons, when I canonicalize (compile books or similar long works), I plan to try much harder to present what rationality is, and not let it be a reaction to or a refutation of anything.
Writing that way takes more effort, though.
they may be much less personally offended if you never spell out the application—not sure why, but that probably is how it works.
Once you connect the dots and make the application explicit, they feel honor-bound to take offense and to defend their theism, regardless of whether they personally want to take offense or not. In their mind, making the application explicit shifts the discussion from being about ideas to being about their core beliefs and thus about their person.
For me, this appears to be correct.
If all goes according to plan, by then we will be able to bring up more controversial examples without debate descending into nonsense. Let’s cross that bridge when we come to it.
I think there are other examples with just as much agreement on their wrongness, many of which involve a much lower degree of investment even for their believers. Astrology, for instance, has many believers, but their beliefs tend to be fairly weak and don’t produce such a defensive reaction when criticized. Lots of other superstitions also exist, so sadly I don’t think we’ll run out of examples any time soon.
But because people aren’t so invested in it, they mostly won’t work so hard to rationalise it; those who are really trying to be rational will simply drop it, and you’re left with a fairly flabby opposition. Whereas lots of smart people who really wanted to be clear-thinking have fought to hang onto religion, and built huge castles of error to defend it.