I was always under the impression that a sort of “work” can lead you to emotionally believe things that you already know to be true in principle. I suspect that a lot of practice in actually believing what you know will eventually cause the gap between knowing and believing to disappear. (Sort of the way that practice in reading eventually produces a person who can’t look at a sentence without reading it.)
For example, I imagine that if you played some kind of betting game every day and made an effort to be realistic, you would stop expecting that wishing really hard for low-probability events could help you win. Your intuition/subconscious would eventually sync up with what you know to be true.
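To make the betting-game idea concrete, here is a minimal sketch (the win probability, payout, and stake are made-up illustrative numbers, not anything from the comment above): over many plays, the average profit per bet converges on the mathematical expectation, and no amount of wishing shifts it.

```python
import random

def simulate_bets(p_win=0.05, payout=10.0, stake=1.0, n=100_000, seed=0):
    """Play n independent bets: pay `stake` each time, win `payout` with probability p_win.

    Returns the average profit per bet, which converges on the
    mathematical expectation p_win * payout - stake as n grows.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        total += (payout if rng.random() < p_win else 0.0) - stake
    return total / n

expected = 0.05 * 10.0 - 1.0   # mathematical expectation: -0.5 per bet
observed = simulate_bets()      # empirical average, close to -0.5
```

Playing a game like this daily, and watching `observed` track `expected` regardless of how one feels about any given bet, is exactly the sort of repeated evidence that could sync intuition up with the math.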
Similarly: acting on the basis of what I believe, even if my emotions aren’t fully aligned with those beliefs (for example, doing things I believe are valuable even if they scare me, or avoiding things I believe are risky even if they feel really enticing), can often cause my emotions to change over time.
But even if my emotions don’t change, my beliefs and my behavior still do, and that has effects.
This is particularly relevant for beliefs that are strongly associated with things like group memberships, such as in the atheism example you mention.
I was always under the impression that a sort of “work” can lead you to emotionally believe things that you already know to be true in principle.
I strongly associate this with Eliezer’s description of the brain as a cognitive engine that needs to do a certain amount of thermodynamic work to arrive at a given certainty level, and with the idea that reasoned, logical conclusions that you ‘know’ fail to produce belief (enough certainty to act on the knowledge) because they don’t make your brain do enough work.
I imagine that forcing someone to deduce bits of probability math from earlier principles and observations, then having them use it to analyze betting games until they can generalise to concepts like expected value, would be enough work to make them believe probability theory.
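As a toy version of that exercise, the expected-value calculation itself is short enough to write out explicitly (the die bet below is a hypothetical example of my own, not one from the thread):

```python
from fractions import Fraction

def expected_value(outcomes):
    """Expected value of a bet given (probability, payoff) pairs.

    Probabilities must sum to 1; exact Fractions avoid float rounding.
    """
    assert sum(p for p, _ in outcomes) == 1, "probabilities must sum to 1"
    return sum(p * v for p, v in outcomes)

# A die bet: win 6 units on a six, lose 1 unit otherwise.
die_bet = [(Fraction(1, 6), Fraction(6)),
           (Fraction(5, 6), Fraction(-1))]

ev = expected_value(die_bet)  # (1/6)*6 + (5/6)*(-1) = 1/6
```

Deriving a formula like this yourself and then checking it against actual game outcomes is, I'd guess, the kind of work that makes the conclusion stick.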
I was always under the impression that a sort of “work” can lead you to emotionally believe things that you already know to be true in principle. I suspect that a lot of practice in actually believing what you know will eventually cause the gap between knowing and believing to disappear. (Sort of the way that practice in reading eventually produces a person who can’t look at a sentence without reading it.)
For example, I imagine that if you played some kind of betting game every day and made an effort to be realistic, you would stop expecting that wishing really hard for low-probability events could help you win. Your intuition/subconscious would eventually sync up with what you know to be true.
(nods) That’s been my experience.