If someone doesn’t value evidence, what evidence are you going to provide that proves they should value evidence? If someone doesn’t value logic, what logical argument would you invoke to prove they should value logic?
--Sam Harris
You put them into a social environment where the high-status people value logic and evidence. You give them the plausible promise that they can increase their status in that environment by increasing the amount that they value logic and evidence.
How would this encourage them to actually value logic and evidence instead of just appearing to do so?
The subject’s capacity for deception is finite, and will be needed elsewhere. Sooner or later it becomes more cost-effective for the sincere belief to change.
That is breathtakingly both the most cynical and beautiful thing I have read all day :)
Postcynicism FTW!
I generally agree with your point. The problem with the specific application is that the subject’s capacity for thinking logically (especially if you want the logic to be correct) is even more limited.
If the subject is marginally capable of logical thought, the straightforward response is to try stupid random things until it becomes obvious that going along with what you want is the least exhausting option. Even fruit flies are capable of learning from personal experience.
In the event of total incapacity at logical thought… why are you going to all this trouble? What do you actually want?
That depends on how much effort you’re willing to spend on each subject verifying that they’re not faking.
People tend to conform to their peers’ values.
And for that matter, to start believing what they behave as if they believe.
It’s not a question of encouragement. Humans tend to want to be like the high-status folk they look up to.
Want to be like or appear to be like? I’m not convinced people can be relied on to make the distinction, much less choose the “correct” one.
Or do they want to be like what those folks appear to be?
I think the most common human tactic for appearing to care is to lie to themselves about caring until they actually believe they care; once this is in place they keep up appearances by actually caring if anyone is looking, and if people look often enough this just becomes actually caring.
Maybe the idea could gain popularity from a survival-island type reality program in which contestants have to measure the height of trees without climbing them, calculate the diameter of the earth, or demonstrate the existence of electrons (in order of increasing difficulty).
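For the curious, here is a rough sketch of how the first two challenges might be tackled with nothing more than a paced-off distance and a couple of angle measurements; every number below is made up purely for illustration.

```python
import math

# All figures here are invented for illustration only.

# Tree height without climbing: pace off a distance from the trunk, sight the
# top, and measure the angle of elevation (a protractor and a weighted string
# will do). Then height is roughly distance * tan(angle) + eye height.
distance_m = 20.0        # paced-off distance from the trunk (assumed)
elevation_deg = 35.0     # measured angle up to the treetop (assumed)
eye_height_m = 1.6       # observer's eye level above the ground (assumed)
tree_height_m = distance_m * math.tan(math.radians(elevation_deg)) + eye_height_m

# Earth's diameter, Eratosthenes-style: compare the sun's noon shadow angle at
# two camps a known north-south distance apart; the angular difference tells
# you what fraction of a full circle that distance spans.
angle_difference_deg = 7.2       # shadow-angle difference between the camps (assumed)
north_south_distance_km = 800.0  # distance between the camps (assumed)
circumference_km = north_south_distance_km * 360.0 / angle_difference_deg
diameter_km = circumference_km / math.pi

print(f"tree height ~ {tree_height_m:.1f} m, Earth diameter ~ {diameter_km:.0f} km")
```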
Couple of attempts:
The hard sciences
Professions with a professional code of ethics, and consequences for violating it.
This reminds me of

“You can’t reason someone out of a position they didn’t reason themselves into.”
which I believe is a paraphrasing of something Jonathan Swift said, but I’m not sure. Anyone have the original?
I don’t think this is empirically true, though. Suppose I believe strongly that violent crime rates are soaring in my country (Canada), largely because I hear people talking about “crime being on the rise” all the time, and because I hear about murders on the news. I did not reason myself into this position, in other words.
Then you show me some statistics, and I change my mind.
In general, I think a supermajority of our starting opinions (priors, essentially) are held for reasons that would not pass muster as ‘rational,’ even if we were being generous with that word. This is partly because we have to internalize a lot of things in our youth and we can’t afford to vet everything our parents/friends/culture say to us. But the epistemic justification for the starting opinions may be terrible, and yet that doesn’t mean we’re incapable of having our minds changed.
The chance of this working depends greatly on how significant the contested fact is to your identity. You may be willing to believe abstractly that crime rates are down and public safety is up after being shown statistics to that effect—but I predict that (for example) a parent who’d previously been worried about child abductions after hearing several highly publicized news stories, and who’d already adopted and vigorously defended childrearing policies consistent with this fear, would be much less likely to update their policies after seeing an analogous set of statistics.
I agree, but I think part of the process of having your mind changed is the understanding that you came to believe those internalized things in a haphazard way. And you might be resisting that understanding because of the reasons @Nornagest mentions—you’ve invested in them or incorporated them into your identity, for example. I think I’m more inclined to change the quote to

“You can’t expect to reason someone out of a position they didn’t reason themselves into.”
to make it slightly more useful in practice, because often changing the person’s mind will require not only knowing the more accurate facts or proper reasoning, but also knowing why the person is attached to his old position—and people generally don’t reveal that until they’re ready to change their mind on their own.
Oops, I guess I wasn’t sure where to put this comment.
It looks to me like you arrived at this position via weighing the available evidence. In other words, you reasoned yourself into it. Upon second reading I see you don’t have a base rate for the amount of violent crime on the news in peaceful countries, and you derived a high absolute level from a high[er than you’d like] rate of change. But you’ve shown a willingness to reason, even if you reasoned poorly (as poorly as I do when I’m not careful. Scary!) So I think jooyus’ quote survives.
If you can’t appeal to reason to make reason appealing, you appeal to emotion and authority to make reason appealing.
Put them in a situation where they need to use logic and evidence to understand their environment and where understanding their environment is crucial for their survival, and they’ll figure it out by themselves. No one really believes God will protect them from harm...
I have some friends who do… At least insofar as things like “I don’t have to worry about finances because God is watching over me, so I won’t bother trying to keep a balanced budget.” Then again, being financially irresponsible (a behaviour I find extremely hard to understand and sympathize with) seems to be common-ish, and not just among people who think God will take care of their problems.
Why not? Thinking about money is work. It involves numbers.
Moreover, it often involves a great deal of stress. Small wonder that many people try to avoid that stress by just not thinking about how they spend money.
Well… as something completely and obviously deterministic (the amount of money you have at the end of the month is the amount you had at the beginning of the month, plus the amount you’ve earned, minus the amount you’ve spent, for a sufficiently broad definition of “earn” and “spend”), that’s about the last situation in which I’d expect people to rely on God. With stuff which is largely affected by factors you cannot control directly (e.g. your health) I would be much less surprised.
Once you have those figures, it is deterministic; however, at the start of the month, those figures are not yet determined. One might win a small prize in a lottery; the price of some staple might unexpectedly increase or decrease; an aunt may or may not send an expensive gift; a minor traffic accident may or may not happen, requiring immediate expensive repairs.
So there are factors that you cannot control that affect your finances.
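As a toy illustration of both points (the end-of-month identity is deterministic, while some of the figures feeding it are not), here is a short sketch; every amount and probability in it is invented.

```python
import random

# Toy numbers only: the identity (end = start + earned - spent) is deterministic,
# but some of the inputs are outside your control, as noted above.
random.seed(0)

start_of_month = 1200.00
salary = 2500.00                                     # roughly predictable income
gift = 150.00 if random.random() < 0.10 else 0.00    # the aunt may or may not send one
lottery = 50.00 if random.random() < 0.05 else 0.00  # small prize, low probability
planned_spending = 2400.00
repair = 600.00 if random.random() < 0.15 else 0.00  # a minor accident may force this

earned = salary + gift + lottery
spent = planned_spending + repair
end_of_month = start_of_month + earned - spent       # deterministic once the figures exist
print(round(end_of_month, 2))
```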
Does this cause you to doubt the veracity of the claim in the parent, or to update towards your model of what people rely on God for being wrong? I guess it should probably be both, to some extent. It’s just not really clear from your post which you’re doing.
Mostly the latter, as per Hanlon’s razor.
“Praying for healing” was quite a common occurrence at my friend’s church. I didn’t pick that as an example because it’s a lot less straightforward. Praying for healing probably does appear to help sometimes (placebo effect), and it’s hard enough for people who don’t believe in God to be rational about health–there aren’t just factors you cannot control, there are plenty of factors we don’t understand.
There hasn’t been a lot of money spent researching it, but meta-analyses of the studies that have been conducted show that on average there is no placebo effect.
That’s really interesting...I had not heard that. Thanks for the info!
I think that’s mostly because money is too abstract, and as long as you get by you don’t even realize what you’ve lost. Survival is much more real.
Sadly, that only works on a natural-selection basis, so the ethics boards forbid us from doing this. If they never see anyone actually failing to survive, they won’t change their behavior.
Can’t make an omelette without breaking some eggs. Videotape the whole thing so the next one has even more evidence.
If you threaten someone’s survival, they are likely to get emotional. That’s not the best mental state in which to apply logic.
Suicide bombers don’t suddenly start believing in reason just before they are sent out to kill themselves.
Soldiers in trenches who fear for their lives, on the other hand, do often start to pray. Maybe there are a few atheists in foxholes, but that state seems to promote religiousness.
Does it promote religiousness or attract the religious?
I think it just promotes grasping at straws.
Take all their stuff. Tell them that they have no evidence that it’s theirs and no logical arguments that they should be allowed to keep it.
They beat you up. People who haven’t specialized in logic and evidence have not therefore been idle.
Shoot them?
I think you just independently invented the holy war.
This is from the Sam Harris vs. William Lane Craig debate, starting around the 44 minute mark. IIRC, Luke’s old website has a review of this particular debate.
You can find out what persuades them and give them that.
And in some instances that would likely be what we call logic or evidence.
You usually can’t get someone with a spider phobia to drop their phobia by trying to convince them with logic or evidence. On the other hand, there are psychological strategies to help them get rid of the phobia.
I think cognitive behavioural therapy for phobias, which seems to work pretty well in a large number of cases, actually relies on helping people see that their fear is irrational.
As someone with a phobia, I can tell you from experience that realizing your fear is irrational doesn’t actually make the fear go away. Sometimes it even makes you feel more guilty for having it in the first place. Realizing it’s irrational just helps you develop coping strategies for acting normal when you’re freaking out in public.
Oh sure, I can definitely believe that. Maybe a better choice of wording above would have been “internalise” rather than “see”, which would rather negate my point, I guess. Or maybe it works differently for some people. I don’t have any experience with phobias or CBT myself.
It’s alief vs. belief. It’s one thing to see that, in theory, almost all spiders are harmless. It’s another to remain calm in the presence of a spider if you’ve had a history of being terrified of them.
Desensitization is a process of teaching a person how to calm themselves, and then exposing them to things which are just a little like spiders (a picture of a cartoon spider, perhaps, or the word spider). When they can calm themselves around that, they’re exposed to something a little more like a spider, and learn to be calm around that.
The alief system can learn, but it’s not necessarily a verbal process.
Even when it is verbal, as when someone learns to identify various sorts of irrational thoughts, it’s much slower than understanding an argument.
Right; that’s the “behavioural” part of cognitive behavioural therapy, right? But the “cognitive” part is an explicit, verbal process.