I think if you drop the view that door-to-door evangelism is about conversation and see it mostly as a retention mechanism instead, it addresses some of the issues in the article.
The statistics on the number of people converted to a religion because someone came to their door and argued with them are pitiful. If you spend two years trying to convert people that way, the median expectation is “no conversions” (one or two is substantial success, and more than that is a crazy outlier). If your goal is actually to change people’s minds your approach has to be much more subtle. It requires building a relationship of trust, respect, and care for a long time before a normal person will take your words as worthwhile evidence for updating their life philosophy.
Far more common are vegans who are adamantly non-condemnatory. They may abstain from using any sort of animal products on strictly moral grounds, but, they will defensively assert, they’re not going to criticize anyone else for doing otherwise.
As near as I can tell, this is pretty much the correct strategy for honestly convincing someone to adopt the practices suggested by a belief, whether it’s a matter of veganism or Mormonism. My understanding from reading about the sociology of evangelism is that the Mormons give pamphlets to the laity that tell them not to proselytize to friends, co-workers, neighbors, etc. Their job is to be friendly, helpful, and admirable while deflecting questions about their beliefs as “helpful for me but not that important” until someone is so impressed by the quality of their lives and how nice they are at a BBQ that they ask two or three times about their beliefs. Then the family invites the interested party (usually an entire family, with bonds of friendship that run husband-to-husband, wife-to-wife, kids-to-kids) to a church social event to help them see how things work. If the family is still interested, the believing family doesn’t “close the sale” itself—they invite the interested family over to talk with a specialized member of the church who can help the interested family formally articulate their by-now-pre-existing interest in joining the church.
In the meantime, from what I can tell, the real adaptive function of going door to door is that it exposes a new adult believer to a lot of debate-style conversation about the idea they are promoting. In the absence of effective epistemic warnings to the contrary they become more committed to their idea via massive commitment-and-consistency effects and various other cognitive biases. This makes them less likely to feel that the belief is just a matter of lip service, which would make a lapse in belief more likely in the absence of the social processes of regular church attendance.
One of my pet hypotheses for “where rationalists come from” is that many of us engage in debate-style conversations (similar to proselytizing) for fun when we are young, in the absence of any coherent doctrine to defend that has been handed down to us by an institution. This practice teaches us respect for the usefulness of defending “the right content” as a helpful technique for the intrinsically valuable outcome of winning a debate. Then epistemic rationality is intelligible as a truth-biased (and hence nominally moral) method to find the content and justifications that will be useful for winning future debates… then you have enough to bootstrap into logical fallacies and cost-benefit analysis and so on. This isn’t a very admirable theory, but if it were true it would be helpful to know, if you’re trying to teach or find rationalists.
Also, if this hypothesis is on the mark it probably has sociological implications: If someone gets an emotional boost from winning a debate then someone else is going to lose a debate and get negative reinforcement for the same verbal interactions. Unless this process was engineered somehow (with people losing on purpose and not taking it hard?) I suspect that rationalists produced by this method can never be more than a fraction of the population, and they will necessarily be intermixed with people for whom debate has almost entirely negative emotional associations.
From this (admittedly very weird) perspective, evangelists for false doctrines may unwittingly be performing an epistemic public service that, on net, raises the sanity waterline :-)
If you spend two years trying to convert people that way, the median expectation is “no conversions” (one or two is substantial success, and more than that is a crazy outlier).
That’s useful information (for the cultishness discussion thread). Do you have a cite to hand?
I must admit, I’d assumed this method had some success, enough to bother. So I’m glad we don’t have to fear rationalists going door to door waking people too early on Saturday morning and saying “Say, friend, have you read the Sequences?”
Searching for confirmation on Google I found different statistics quoted in an article about relative religious growth rates here. The source may have incentives to inflate their numbers, but they claim:
The average missionary in 1989 brought 9.1 people into the church, while in 2000 the average missionary brought 4.6 people into the church. When one accounts for actual activity and retention rates, with the great majority of LDS convert growth occurring in Latin America and other areas with low retention and only 20-25% of convert growth occurring in North America, it can be determined that of the 4.6 persons baptized by the average missionary each year, approximately 1.3 will remain active.
I think the book would be helpful for the “cultishness discussion” here, but also for the general “rapture of the nerds” critique aimed at the singularity hypothesis when people first encounter it. One helpful thing about The Future of Religion is that it dissolves the confusion that comes up from the conjunction of particular supernatural theories and particular sociological processes.
Sloppy thinking can lead to accusations that some political movements or psycho-therapeutic fads are “cults” when there is no substantial religious doctrine, just a certain set of “human tribal instincts” being deployed in a manner that is characteristic of many voluntary social processes. The LaRouche Movement is an instructive edge case, which is sometimes accused of being a “political cult”.
Any conscious/rational human improvement system probably will have to make use of some of these basic mechanisms if it is to be effective, and the question (as always) is simply whether they are being used for good ends, given the actual state of the world.
Another useful item for the cultishness discussion might be the Bonewits’ Cult Danger Evaluation Frame, because it encodes a useful set of symptoms that predict whether something is going seriously wrong with a group… or at least signal that it is “playing with fluorine”. There are reasons to play with fluorine, but not many good reasons.
Those numbers and references are utterly wonderful, thank you!
Calling something with no substantial religious doctrine a “cult” is not a category error, given the thing I’m talking about when I use the word “cult”. It’s not my private definition either. It’s a particular sociological phenomenon in a group. It’s the thing someone is worried about when they say “Has Bob joined some sort of cult?” I need to actually describe what I’m talking about, which I’m not sure how to unpack with the requisite accuracy. But LaRouche is right in there, and Amway is too, for another example.
Now to relate that more specifically to predatory infectious memes. And dig up all the stuff I read ten or so years ago.
Yeah, I read into that stuff largely out of a curiosity about memetics too :-)
It’s the thing someone is worried about when they say “Has Bob joined some sort of cult?”
I like “the mom test” for that. If you’re hanging out with a group based around a common set of beliefs that are taken by the group to be formally true, would you be too embarrassed to ask your mom for advice about the group or the beliefs? If so, then for you (given your parents and their community and so on) it’s close enough to a cult that you really should stay away (unless you’re just doing participant observation as a research project). On the other hand, if you’re not too embarrassed, then you really should actually go ask her for advice, because that kind of practical and emotionally grounded feedback is good input for people to be mindful of, even if it isn’t always perfect.
This is a primary reason I recommend that people talk to one or both parents about SIAI if they start suspecting that they should get personally involved with existential risk activism and FAI and so on :-)
I’m personally not at all surprised that the success rates are so low. If the evangelists had actually internalized the idea that they’re doing it to save people from hell, they’d take it much more seriously. Instead, it’s generally framed as a matter of duty to the religious community, or a sign of personal virtue for making the effort at all. It doesn’t need to have a significant success rate to sustain itself, although I imagine that if it were inefficient enough that most door-to-door evangelists had never heard of a successful conversion, they might reevaluate their methods.
If you really wanted to convert others, though, not merely to do the most that was comfortable, or convince yourself you had made an honest effort, I think that the mainstream Mormon approach would not be the most effective method. Or rather, it doesn’t take the approach far enough. Maybe I’m merely projecting an atypical attitude, but I think it would be far more effective to dedicate your life to moral causes. Give away everything you own, work your hands to the bone to give away more, try to set a standard that would make Gandhi look undercommitted. The number of people who will be impressed and interested in the beliefs of a mere upstanding community member is nothing compared to the number who would be interested in a moral paragon.
Of course, one might argue that people will be driven off if they suspect that the religious beliefs demand too much of them, but while religious believers tend to claim status from the efforts of the most exemplary members of their faith, they rarely try to meet their standards. Also, I’m highly skeptical of any argument that states that the most effective approach conveniently intersects with what is most comfortable.
I’ll second a part of your post: some people simply don’t like debate (either debate in person or argumentative writing.) Communicating with a non-debater about a debate-like topic is very strange; it’s like trying to fence against an opponent wielding a bowl of Jello instead of a sword. LessWrong has a number of essays about the ways people debate irrationally (arguments as soldiers, dark side epistemology, etc.) but I think far more of the population just doesn’t debate at all, and will respond blankly if you try to debate them.
People without a debate mindset might think “Oh, I’m not very good at arguments.” People with a debate mindset, I believe, don’t ever think in those terms: if we believe something strongly, it’s because we think we have a good argument for it, and if we don’t have a good argument for it, then we don’t have particularly strong convictions about it. But a non-debater can both believe something deeply and believe she could be out-argued by the opposing view.
For what it’s worth: I consider myself a “debater” in the sense you mean, but there are plenty of things where I believe them, I feel strongly about them, and I believe I could be out-argued by a sufficiently clever articulation of an opposing view, even if that view was wrong.
Out-argued in what sense? Do you think that you wouldn’t be able to see why their arguments were wrong, or just that you wouldn’t be able to persuade an impartial audience that you were more right?
On subjects that I hold strong beliefs in, I anticipate that I could not be out-argued in the first sense. If someone was able to offer such arguments, I would either have to conclude that they were right, or that I didn’t understand the topic as well as I thought I did in the first place, and would have to revise the strength of my beliefs.
I’m not certain that the dividing line between those two senses is as crisp as you make it sound, but I guess I mean something like the latter sense. That is, I can imagine someone articulating their arguments for -P in such a way that their arguments are more compelling than mine are for P, even when P is true.
The dividing line comes from the fact that an impartial audience is not at all the same thing as a rational audience, and there’s a lot more to rhetoric than making arguments that are logically sound and tenable.
My general defence against this is to be too difficult to actually convince. I nod and smile and acknowledge the quality of the arguments but am not actually convinced to change my mind. I may well have taken this too far. (It certainly frustrates the heck out of people.) It’s useful if you know you’re fond enough of new ideas to be susceptible to neophilia-induced bad ideas. It’s somewhat like being just plain dim.
Ah. It sounds like we have different interpretations of what SarahC meant by out-argued.
I don’t believe a clever debater can long-term convince me of the falsehood of something I believe and feel strongly about (sadly, even if it’s true), although they might induce me to go along temporarily.
This is, incidentally, not to say that I can’t be caught up in cultishness, merely to say that clever arguments are not sufficient (nor, sadly, necessary) to do it.
I think there is a difference between (1) people who enjoy verbally competing to be correct, (2) people who enjoy such competition and have formally trained, studied, and practiced it, and (3) people who have trained, studied, practiced and come out disillusioned with strategic communication. When I was 10 I was the first kind of debater. I think many law school students, by stereotype, are the second kind. Having spent two years in college on a policy debate team and later coached a high school debate team I find myself feeling kinship with the third group.
I think formal competitive debate experience plus philosophy can help to calibrate people in very useful ways. Of a proposition Q you can ask what P(Q|H) is for various values of H if you’re sticking to pure Bayesian rationality. You can also have a sense of how difficult it would be to go aff (or neg) on Q (or not-Q) in front of different audiences.
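As a toy illustration of the P(Q|H) framing (all numbers here are invented purely for the example), a Bayes update over a proposition Q given evidence H can be sketched as:

```python
# Toy Bayes update: P(Q|H) from a prior P(Q) and the likelihoods of the
# evidence H under Q and under not-Q. Numbers below are purely illustrative.
def posterior(p_q, p_h_given_q, p_h_given_not_q):
    """Return P(Q|H) via Bayes' rule."""
    p_h = p_h_given_q * p_q + p_h_given_not_q * (1.0 - p_q)
    return p_h_given_q * p_q / p_h

# Prior of 0.5; evidence three times likelier if Q is true than if it is false.
print(round(posterior(0.5, 0.6, 0.2), 2))  # 0.75
```

Nothing here is specific to debate; it is just the standard Bayes’-rule calculation the P(Q|H) notation alludes to.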
Among skilled debaters where both sides have a research library and knowledge of Q in advance and skilled debaters are the audience, whoever goes aff and controls the content of Q should generally win.
But a good debater should also be able to take a large number of “open questions” as Q, and win in all four scenarios (aff/neg x Q/not-Q) with an arbitrary audience against an unskilled opponent. Watching this second thing happen, and learning to do it, and teaching other people to do it has given me relatively little respect for casual debates as a truth seeking process, but a lot of respect for formal debates as an educational process.
If anyone reading this is picking colleges, I recommend looking for one with a CEDA program and spending a year on the team :-)
I agree that casual debate isn’t so great as a truth seeking process.
What I was saying was that some people seem to have trouble (or dislike) thinking propositionally; I’ve had conversations where I’m proposing an argument and the other person seems to think that I just want to be cheered up or something, and doesn’t realize I actually want to discuss the substance.
I’ll second a part of your post: some people simply don’t like debate (either debate in person or argumentative writing.) Communicating with a non-debater about a debate-like topic is very strange; it’s like trying to fence against an opponent wielding a bowl of Jello instead of a sword.
Speaking as a fencer, I’m having a very hard time imagining what this would actually be like.
Really? I’m not a fencer, but I just imagine a fencer standing in a kitchen while the non-debater pulls out the Jello from the fridge. The fencer stands there confused for a bit, while the non-debater goes on their way, but eventually realizes ze’s a fencer for a reason! The fencer lunges, misses, and hits the bowl, breaking it and spilling the Jello. The non-debater then either gets angry and annoyed, or sighs, pulls out another bowl and begins to make a second batch of Jello.
I don’t remember where I read the numbers I gave above, but the first place I’d look would be The Future of Religion: Secularization, Revival and Cult Formation, which is a generally interesting read.
Of course, in this context, it’s easy to see how putting one’s beliefs into their proper perspective and fully internalizing them can be a tremendous disadvantage. It’s no wonder if most people interpret their religions to only demand as much of them as is convenient.
Did that help? :)
Not much, no.
I knew there was a reason I like to hide a glock in my Jello.
This explains some more of the South Park episode about the Mormons. It is actually pretty accurate and good.