Suppose I believe strongly that violent crime rates are soaring in my country (Canada), largely because I hear people talking about “crime being on the rise” all the time, and because I hear about murders on the news. I did not reason myself into this position, in other words. Then you show me some statistics, and I change my mind.
The chance of this working depends greatly on how significant the contested fact is to your identity. You may be willing to believe abstractly that crime rates are down and public safety is up after being shown statistics to that effect—but I predict that (for example) a parent who’d previously been worried about child abductions after hearing several highly publicized news stories, and who’d already adopted and vigorously defended childrearing policies consistent with this fear, would be much less likely to update their policies after seeing an analogous set of statistics.
This is partly because we have to internalize a lot of things in our youth; we can't afford to vet everything our parents/friends/culture tell us. The epistemic justification for those starting opinions may be terrible, but that doesn't mean we're incapable of having our minds changed.
I agree, but I think part of the process of having your mind changed is understanding that you came to believe those internalized things in a haphazard way. And you might resist that understanding for the reasons @Nornagest mentions: you've invested in those beliefs or incorporated them into your identity, for example. I think I'm more inclined to change the quote to
You can’t expect to reason someone out of a position they didn’t reason themselves into.
to make it slightly more useful in practice, because changing a person's mind often requires knowing not only the more accurate facts or proper reasoning, but also why the person is attached to their old position, and people generally don't reveal that until they're ready to change their mind on their own.
Oops, I guess I wasn’t sure where to put this comment.