Epistemic Circularity
Note: I wrote a better article about this; you should read that instead of this stub: The Problem of the Criterion.
I’ve previously argued for the existence of what I’ve called a “free variable” in epistemology: because no single way of knowing (a system of epistemology, or simply an epistemology) can be both complete and consistent, we are forced to choose between ways of knowing. While working on a current project, and not wanting to rederive anything someone else has already argued in the academic literature, I discovered that this feature already has a name and has been written about: epistemic circularity.
I find it somewhat surprising that we haven’t addressed this more within the LW community, though it’s perhaps less surprising than it might otherwise be given LW’s positivist leanings. I don’t have much to say about epistemic circularity at the moment, although I do consider it critical to my worldview and a crux of my thinking about philosophical conservatism for alignment research. But I did at least want to bring some wider attention to a concept that, to my recollection, we’ve ignored as a community.
If you want to spread the use of a piece of terminology, I would strongly suggest providing an explicit definition in this post, as many people won’t want to click through to an article to figure out what you are talking about. This is particularly important when the article is paywalled.
It’s worse than that: epistemology influences epistemology. You can’t judge an epistemology without an epistemology.