I’m writing a book about epistemology. It’s about the Problem of the Criterion, why it matters, and what it tells us about how we approach knowing the truth.
I’ve also written a lot about AI safety. Some of the more interesting stuff can be found at the site of my currently-dormant AI safety org, PAISRI.
This feels like it has too much wiggle room. For instance, what counts as an “easy” problem of consciousness, and what counts as “transcending” it? Good definitions generally avoid words that either do too much work or invite judgment calls about what counts.