Well, wouldn’t it be great if we had sound metaphilosophical principles that help us distinguish epistemic pits from correct conclusions! :P
I actually think humanity is in a bunch of epistemic pits that we mostly aren’t even aware of. For example, if you share my view that Buddhist enlightenment carries significant (albeit hard-to-articulate) epistemic content, then basically all of humanity over basically all of time has been in the epistemic pit of non-enlightenment.
If we figure out the metaphilosophy of how to robustly avoid epistemic pits, and build that into an aligned AGI, then in some sense none of our current epistemic pits are that bad, since that AGI would help us climb out in relatively short order. But if we don’t figure it out, we’ll plausibly stay in our epistemic pits for unacceptably long periods of time.