You talk about belief the way popular culture talks about love: as some kind of external influence that overcomes your resistance.
And belief can be like that, sure. But belief can also be the result of doing the necessary work.
I realize that’s an uncomfortable idea. But it’s also an important one.
Relatedly, my own thoughts on the value of truth: when the environment is very forgiving and even suboptimal choices mostly work out to my benefit, the cost of being incorrect a lot is mostly opportunity cost. That is, things go OK, and even get better sometimes. (Not as much better as they would have gotten had I optimized more, but still: better.)
I’ve spent most of my life in a forgiving environment, which makes it very easy to adopt the attitude that having accurate beliefs isn’t particularly important. I can go through life giving up lots of opportunities, and if I just don’t think too much about the improvements I’m giving up I’ll still be relatively content. It’s emotionally easy to discount possible future benefits.
Even if I do have transient moments of awareness of how much better it can be, I can suppress them by thinking about all the ways it can be worse and how much safer I am right where I am, as though refusing to climb somehow protected me from falling.
The thing is: when the environment is risky and most things cost me, the cost of being incorrect is loss. That is, things don’t go OK, and they get worse. And I can’t control the environment.
It’s emotionally harder to discount possible future losses.
I was always under the impression that a sort of “work” can lead you to emotionally believe things that you already know to be true in principle. I suspect that a lot of practice in actually believing what you know will eventually cause the gap between knowing and believing to disappear. (Sort of the way that practice in reading eventually produces a person who can’t look at a sentence without reading it.)
For example, I imagine that if you played some kind of betting game every day and made an effort to be realistic, you would stop expecting that wishing really hard for low-probability events could help you win. Your intuition/subconscious would eventually sync up with what you know to be true.
(nods) That’s been my experience.
Similarly: acting on the basis of what I believe, even if my emotions aren’t fully aligned with those beliefs (for example, doing things I believe are valuable even if they scare me, or avoiding things I believe are risky even if they feel really enticing), can often cause my emotions to change over time.
But even if my emotions don’t change, my beliefs and my behavior still do, and that has effects.
This is particularly relevant for beliefs that are strongly associated with things like group memberships, such as in the atheism example you mention.
I strongly associate this with Eliezer’s description of the brain as a cognitive engine that needs to do a certain amount of thermodynamic work to arrive at a given level of certainty, and with the idea that reasoned, logical conclusions you ‘know’ fail to produce belief (enough certainty to act on the knowledge) because they don’t make your brain do enough work.
I imagine that forcing someone to deduce bits of probability math from earlier principles and observations, then having them use it to analyze betting games until they can generalise to concepts like expected value, would be enough work to make them believe probability theory.
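For concreteness, here is a minimal sketch of the sort of betting-game exercise I have in mind; the game, its win probability, and its payout are all made-up numbers for illustration. The point is that the analytic expected value and the long-run average per round converge on the same figure, and nothing the player wishes for enters into either one.

```python
import random

# Hypothetical betting game (made-up numbers): pay 1 unit to play; with
# probability p you win a payout of `payout` units, otherwise nothing.
p = 0.05
payout = 10.0
stake = 1.0

# Analytic expected value per round: p * payout - stake.
expected_value = p * payout - stake

# Empirical average over many simulated rounds. Only p and payout matter;
# how much the player hopes to win never appears anywhere in the model.
rounds = 100_000
total = 0.0
for _ in range(rounds):
    won = random.random() < p
    total += (payout if won else 0.0) - stake
empirical_average = total / rounds

print(f"expected value per round:    {expected_value:+.3f}")
print(f"empirical average per round: {empirical_average:+.3f}")
```

Run something like that every day for a while and I suspect the expected-value number stops being a fact you merely know and starts being something you actually believe.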