If I had been talking to the person you were talking to, I might have said something like this:
Why are you deceiving yourself into believing Orthodox Judaism as opposed to something else? If you are, in fact, deriving a benefit from deceiving yourself, while at the same time being aware that you are deceiving yourself, then why haven’t you optimized your deceptions into something other than an off-the-shelf religion by now? Have you ever really asked yourself the question: “What is the set of things that I would derive the most benefit from falsely believing?” Now if you really think you can make your life better by deceiving yourself, and you haven’t thought carefully about exactly which things you would be better off deceiving yourself about, then it seems unlikely that you’ve actually got the optimal set of self-deceptions in your brain. In particular, this means it’s probably a bad idea to deceive yourself into thinking that your present set of self-deceptions is optimal, so please don’t do that.
OK, now do you agree that finding the optimal set of self-deceptions is a good idea? Good, but I have to give you one very important warning. If you actually want the optimal set of self-deceptions, you’d better not deceive yourself at all while you are constructing it, or you’ll probably get it wrong: if, for example, you are currently sub-optimally deceiving yourself into believing that it is good to believe X, then you may end up deceiving yourself into actually believing X, even if that’s a bad idea. So don’t self-deceive while you’re trying to figure out what to deceive yourself about.
Therefore, to the extent that you are in control of your self-deceptions (which you do seem to be), the first step toward getting the best set of self-deceptions is to disable them all and begin a process of sincere inquiry into which beliefs it is a good idea to have.
And hopefully, at the end of the process of sincere inquiry, they discover that the best set of self-deceptions happens to be empty. And if they don’t, if they actually thought it through with the highest epistemic standards, and even considered epistemic arguments such as honesty being one’s last defence, slashed tires, and all that… well, I’d be pretty surprised. But if I were actually shown such an argument, and it actually did conform to the highest epistemic standards… maybe, provided it’s more likely that the argument was actually that good than that I was simply being deceived, I’d even concede.
Disclaimer: I don’t actually expect this to work with high confidence, because this sort of person might not actually be able to do a sincere inquiry. Regardless, if this sort of thought got stuck in their head, it could at least increase their cognitive dissonance, which might be a step on the road to recovery.
“Disclaimer: I don’t actually expect this to work with high confidence, because this sort of person might not actually be able to do a sincere inquiry.”
Well, exactly… If the person were thinking rationally enough to contemplate that argument, they really wouldn’t need it.
I have never successfully converted a religious person to atheism, but my ex-girlfriend did. I am a more rational person than her, I know more philosophy, I have earnestly tried many times, she just did this once, etc. How did she do it? The person in question was male and his religion forbade him from sex outside marriage. Most people are mostly ruled by their emotions.
My working model of this person was that they had rehearsed emotional and argumentative defenses to protect their belief, or belief in belief, and that they could be reasonably rational in other domains where they weren’t trying to be irrational. It therefore seemed to me that one strategy (while still dicey) for unconvincing such a person would be to come up with an argument which is both:
Solid (fooling or manipulating them into believing the truth would be bad cognitive citizenship, and wouldn’t work anyway, because their defenses would find the weakness in the argument).
Not the same shape as the argument their defenses are expecting.
Roko: How is your working model of the person different from mine?
My working model of a religious person such as the above is that they assess any argument first and foremost on the basis “will accepting this argument cause me to have to abandon my religious belief?”. If yes, execute “search for least implausible counterargument”.
As such, no rational argument whose conclusion obviously leads to the abandonment of religion will work. However, rational arguments that can be accepted on the spot without obviously threatening religion, and which lead via hard-to-predict emotional channels to the weakening and defeat of that belief, might work. It is my suspicion that persuading someone to change their mind on a really important issue almost always works like this.
“she just did this once, etc. How did she do it? ”
By appealing to a non-rational or irrational argument that would lead the person to adopt rationality.
Arguing rationally with a person who isn’t rational, trying to convince them to take up rational thinking, is a waste of time. If it would work, it wouldn’t be necessary. It’s easy to say what course should be taken with a rational person, because rational thought is all alike. Irrational thought patterns can be nearly anything, so there’s no way to specify a single argument that will convince everyone. You’d need to construct an argument that each person is specifically vulnerable to.
The problem is that you often don’t know until you actually start arguing with them that they are irrational or just confused and misled.
George H. Smith has a pretty good essay about arguing people into rationality, “Atheism and the Virtue of Reasonableness”. For example, he advocates the “Presumption of Rationality”: you should always presume your adversary is rational until he demonstrates otherwise. I don’t know if the essay is online or not; I read it as the second chapter of “Atheism, Ayn Rand, and Other Heresies.”
Irrational thought patterns can be nearly anything, but of course they strongly tend to form around standard human cognitive biases. This saves a great deal of time.
I would expect a reply along the lines of:
It’s precisely because I can’t trust my own reasoning when deciding which false beliefs I should have that I accept these which are handed down. I pick Judaism because it’s the oldest and thus has shown through memetic competition that it’s the strongest set of false beliefs one could have.
Or… “I pick Christianity because it’s the most popular and has therefore proven itself memetically competitive.”
I have a lot of friends who think “it’s old, therefore it must be good to have survived this long” about Tarot, Eastern religions, etc.
Personally I’d wanna eliminate the false beliefs even if it cost me my mojo, but that’s a different set of priorities I guess.
In fact, the argument from tradition is considered very strong in alternative medicine in particular and New Age culture in general, even if whatever it is was actually made up^W^Wrediscovered last week.
“Most people are mostly ruled by their emotions.”
To be more specific, most men, for a considerable portion of their lives, are mostly ruled by their sex drives.
To be clear, she never did say, “I am deceiving myself” or “I falsely believe that there is a God”.
I stand corrected. I hereby strike the first two sentences.