Curated. While the question of Epistemic Luck has been discussed on LessWrong before, I like this post for the entertaining and well-written examples that I expect to stick in my mind (the other one I remember from this topic is “Why are all the functional decision theorists in Berkeley and the causal (?) decision theorists in Oxford?”).
It’s a scary topic. What I fear is that not only might you start with different priors from someone else, but you might also assign different likelihood ratios to the evidence you gather, in a way that traps you ever more firmly in your beliefs. Of course, one has the internal feeling of “rightness” about one’s own views...but so do people with opposing views.
The approach at the conclusion of this post doesn’t seem satisfactory to me, but neither am I satisfied with any other approach. At best I recurse to a meta-level and say I trust my views because of the gears I have that say my epistemology is better.
Anyhow, these are some ramblings. I like this post for raising the topic again. It feels tricky, and it feels topical given the charged and high-stakes debates of 2024 that many members of the community find themselves having.
I agree the conclusion isn’t great!
Perhaps unsurprisingly, many people read the last section as an endorsement of some version of “RCTism”, but it’s not actually a view I endorse myself.
What I really wanted to get at in this post was just how pervasive priors are, and how difficult it is to see past them.