Apologies in advance for the long response. Hopefully this will be worth the read.
I greatly appreciate your post because it challenges some of my own beliefs and made me reassess them. I agree that a person can get by in this world with bad epistemological hygiene. However, humans are animals that evolved to adapt behaviors for many environments. Getting by is cheap. The problem with poor epistemological hygiene (EH) isn’t that a person can’t get by. As I see it, there are three issues:
1. The worse your EH is, the more your success depends on luck. If you’re right, it’s by accident, or you learn the hard way.
2. Bad EH means bad predictions and therefore bad future-proofing. The world changes, and people are often slow to adapt, if they adapt at all.
3. Though individual humans can survive, poor EH across all of humanity leads to poor decisions that are made collectively but not deliberately (e.g. the tragedy of the commons, or cycles of political revolution), which hurt us all in ways that are hard to measure, either because they are gradual or because they require comparison with a counterfactual state of the world.
Most animal populations can survive with trial and error, natural selection, and no ability to destroy the world. I would prefer the standards for humanity to be higher. Thoughts?
Anecdote follows:
Coincidentally, I had a conversation at work today that culminated in the concept you describe as “less-examined beliefs that don’t have to pay rent for you to happily contribute in the way you like”. The people with whom I was speaking were successful members of society, so they fell into the uncanny valley for me when they started pushing the idea that everyone has their own truth. I’m not sure if it’s better or worse that they didn’t quite literally believe that, but didn’t know how to better articulate what they actually believed.
Ultimately what I got them to agree to (I think) is that although everyone has irrefutable experiences, what they infer about the structure of the world from those experiences may be testably wrong. Of course, I personally will have no strong beliefs about the truth value of their hypothesis if I have too much conflicting evidence. However, I won’t want to put much effort into testing the hypothesis unless my plans depend on it being true or false. It’s murkier with normative beliefs, because when those become relevant, it’s because they conflict with each other in an irreconcilable way, and it’s much more difficult, if not impossible, to provide evidence that leads people to change their basic normative beliefs.
That said, I suspect that if we’re not making plans that could falsify each other’s beliefs and that conflict with each other’s sense of right and wrong, we’re probably stagnating as a civilization. That ties in with your idea of beliefs not being examined because they don’t have to be. The great problem is that people aren’t putting their beliefs in situations where they will either succeed or fail. To me, that’s the true spirit of science.
For example, my objection to people believing in poltergeists (which is how the conversation started) isn’t that they believe it. It’s that they don’t see the vast implications of a) transhumanism via ghost transformation, b) undetectable spies, c) remote projection of physical force, or d) possibly unlimited energy. They live as if none of those possibilities exist, which to me is a worse indictment of their beliefs than a lack of evidence, and an indictment of their education even if they’re right about the ghosts. If people traced the implications of their beliefs, they could act on them more consistently and falsify them more easily. I strongly suspect that cultivating this habit would yield benefits at both the individual and population levels.
I personally will have no strong beliefs about the truth value of their hypothesis if I have too much conflicting evidence. However, I won’t want to put much effort into testing the hypothesis unless my plans depend on it being true or false.
I like how you said this.
The people with whom I was speaking were successful members of society, so they fell into the uncanny valley for me when they started pushing the idea that everyone has their own truth. I’m not sure if it’s better or worse that they didn’t quite literally believe that, but didn’t know how to better articulate what they actually believed.
In social situations, I’ve been trying to find a delicate and concise way to get across that, “‘Everyone has their own truth’ is not an experience-constraining belief. Saying it is a marker of empathy—good for you (seriously!). But if I wanted to falsify that belief, I wouldn’t know where to begin. What trade-offs do you think you’re making by saying, ‘Everyone has their own truth’?”
“Everyone has their own truth” is just one example of these kinds of applause-lights-y nonbeliefs. I say them too when I’m trying to signal empathy, and not much else.
For example, my objection to people believing in poltergeists (which is how the conversation started) isn’t that they believe it. It’s that they don’t see the vast implications of a) transhumanism via ghost transformation, b) undetectable spies, c) remote projection of physical force, or d) possibly unlimited energy. They live as if none of those possibilities exist, which to me is a worse indictment of their beliefs than a lack of evidence, and an indictment of their education even if they’re right about the ghosts.
Because they live as if none of these possibilities exist (i.e. their experiences are constrained), couldn’t you say that for some definition of “believe,” they don’t actually believe in poltergeists? They’re committing a minor sin by saying out loud that they believe in poltergeists, while not living as though they do.
That said, I’d still say that aligning your stated beliefs with how you behave is admirable and effective.