Thinking about belief in belief.

The fact that utility and probability can be transformed while maintaining the same decisions matches what the algorithm feels like from the inside. When thinking about actions, I often just feel like a potential action is “bad”, and it takes effort to tease out whether I don’t think the outcome is super valuable, or whether there’s a good outcome that I don’t think is likely.
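One standard instance of the invariance I mean (a toy sketch, with all numbers invented): applying a positive affine transformation to the utilities (u → a·u + b, with a > 0) never changes which action has the highest expected value, so the decision algorithm can’t tell the difference from the inside.

```python
# Two candidate actions over three possible outcomes.
probs = [0.2, 0.5, 0.3]                 # probability of each outcome
actions = {
    "A": [10.0, 0.0, -5.0],             # utility of each outcome under action A
    "B": [2.0, 3.0, 1.0],               # utility of each outcome under action B
}

def expected_value(utilities):
    return sum(p * u for p, u in zip(probs, utilities))

def best_action(transform=lambda u: u):
    # Rank actions by EV after transforming every utility.
    evs = {name: expected_value([transform(u) for u in us])
           for name, us in actions.items()}
    return max(evs, key=evs.get)

# A positive affine transform (here 7*u + 100) rescales every EV the same
# way, so the argmax — the decision — is unchanged.
assert best_action() == best_action(lambda u: 7 * u + 100)
```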
You can have things called “beliefs” which are of type action. “Having” this belief is actually your decision to take certain actions in certain scenarios. You can also have things called “beliefs” which are of type probability, and are part of your deep felt sense of what is and isn’t likely/true.
A belief-action that has a high EV (and feels “good”) will probably feel the same as a belief-probability that is close to 1.
Take a given sentence/proposition. You can put a high EV on the belief-action version of that sentence (mayhaps it has important consequences for your social groups) while putting a low probability on the belief-probability version of the sentence.
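To make that split concrete, here’s a toy sketch (all payoffs invented): the *act* of professing a proposition can have high EV, driven by social payoffs, even while your probability on the proposition itself is low.

```python
# Belief-probability: how likely you actually think the proposition is.
p_true = 0.05

# Hypothetical social payoffs for the belief-action of professing it,
# depending on whether the proposition turns out true or false.
payoff_profess = {"true": 1.0, "false": 0.8}

# EV of the belief-action (professing), vs. staying silent (EV 0).
ev_profess = (p_true * payoff_profess["true"]
              + (1 - p_true) * payoff_profess["false"])
ev_silent = 0.0

# ev_profess is high even though p_true is low: the belief-action
# feels "good" while the belief-probability stays near zero.
assert ev_profess > ev_silent and p_true < 0.5
```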
Meta-Thoughts: The above idea is not fundamentally different from belief in belief or crony beliefs, both of which I read a year or more ago. What I just wrote felt like a genuine insight. What do I think I understand now that I don’t think I understood then?
I think that recently (past two months, since CFAR) I’ve had better luck with going into “Super-truth” mode, looking into my own soul and asking, “Do you actually believe this?”
Now, I’ve got many more data points of, “Here’s a thing that I totally thought that I believed(probability) but actually I believed(action).”
Maybe the insight is that it’s easy to get mixed up between belief-prob and belief-action because the felt sense of probability and EV are very very similar, and genuinely non-trivial to peel apart.
^yeah, that feels like it. I think previously I thought, “Oh cool, now that I know that belief-action and belief-prob are different things, I just won’t do belief-action”. Now, I believe that you need to teach yourself to feel the difference between them, otherwise you will continue to mistake belief-actions for belief-probs.
Meta-Meta-Thought: The meta-thoughts were super useful to do, and I think I’ll do it more often, given that I often have a sense of, “Hmmmm, isn’t this basically [insert post in The Sequences here] re-phrased?”