When I first read the sequences, I thought “What do I know and how do I think I know it?” was pretty banal and useless—didn’t everyone know that? Philosophy 101, question your beliefs, look for hidden assumptions, etc.
The older I get, the more I come to think that no, not everyone knows this, and even the people who know it don’t practice it enough. I’m not sure, though.
I think of “What do I know and how do I think I know it?” as the “root cause” of essentially all other epistemic rationality—i.e. if you’re sufficiently good at that one skill, all the others will follow naturally from it. Conversely, that suggests it’s genuinely difficult to get really good at it: if I’m missing any other epistemic rationality skill, it means I’m not yet good enough at “What do I know and how do I think I know it?”.
I’d say the “obvious” version of the skill involves activities which look like questioning beliefs, looking for hidden assumptions, etc. But these are surface-level activities which don’t necessarily trace the whole belief-generating pipeline. The full skill is about modelling the entire physical process which created your map from the territory.
One example I’ve thought about recently: we’ve had a bunch of posts lately on simulacrum levels. Personally, I saw most of the ideas in those posts as kind-of-obvious applications of the general principle/habit “when you hear words, don’t ask what they literally signify, ask what physical process generated them and what that implies about the world”. (Or the HPMOR version: “Professor Quirrell didn’t care what your expression looked like, he cared which states of mind made it likely.”) This is a principle/habit which naturally pops out of modelling the physical process which produces your own beliefs, whenever someone’s words appear in the belief-production pipeline.