What would an ideal epistemology be? I’m not asking for the ideal epistemology itself, just this: how could you tell whether you’d developed one, or whether you were at least getting closer to it?
It kind of depends what you mean by “epistemology”. I was cheating a bit when I said that: many philosophers seem to think that epistemology is simply the study of the concept of knowledge as used by human beings. However, you might instead think that what we’re really interested in is how to get useful information about the world.
In that case the human concept of “knowledge” seems pretty shitty: it’s binary, and it comes with a whole host of subtle complications of usage, whereas something like a graded Bayesian approach seems much better.
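To make the contrast concrete, here’s a minimal Python sketch (not from the original discussion, and with made-up numbers): a binary knowledge predicate can only flip between “knows” and “doesn’t know”, while a Bayesian credence moves smoothly with the evidence via Bayes’ rule.

```python
# A minimal sketch contrasting a binary "knows / doesn't know" predicate
# with a graded Bayesian credence. All names and numbers are illustrative.

def bayes_update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Return the posterior P(H|E) given a prior P(H) and the two likelihoods."""
    p_evidence = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_evidence

# Binary "knowledge": the agent either counts as knowing H or not.
knows_h = False  # no middle ground, however strong the evidence

# Graded credence: the same piece of evidence moves the agent partway.
credence = bayes_update(prior=0.5, p_e_given_h=0.8, p_e_given_not_h=0.3)
print(f"credence in H after one observation: {credence:.2f}")  # ~0.73
```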
So I’m claiming that philosophers aren’t necessarily interested in the latter kind of epistemology; they’re interested in “knowledge” as most humans use it, rather than whatever epistemic concepts you would build into a new agent!
‘Ideal’ is underdetermined here, but we could give it content. I can imagine four basic families of ways to evaluate an epistemology (in addition to combinations):
Territorial: How useful is the epistemology for causing agents to consistently assert truths and deny falsehoods?
Epistemically Rational: How useful is the epistemology for causing agents to believe things in proportion to the strength of the available evidence? This may be a special case of the territorial evaluation, defined so as to exclude gerrymandered epistemologies that only help their agents by coincidence. (A minimal scoring sketch follows this list.)
Instrumentally Rational: How useful is the epistemology for causing agents employing it to attain their personal goals?
Moral: How useful is the epistemology for satisfying everyone’s preferences, including the preferences of people who may not subscribe to the epistemology themselves?
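One hedged way to cash out the “epistemically rational” evaluation in code: score the credences an epistemology produces against realized outcomes using a proper scoring rule, such as the Brier score. The forecasts and outcomes below are invented purely for illustration.

```python
# A sketch of scoring the "epistemically rational" criterion: compare the
# credences an epistemology produces against 0/1 outcomes with a proper
# scoring rule (the Brier score). Data below is invented for illustration.

def brier_score(forecasts: list[float], outcomes: list[int]) -> float:
    """Mean squared error between credences and 0/1 outcomes; lower is better."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

binary_agent = [1.0, 1.0, 0.0, 1.0]    # can only assert or deny
bayesian_agent = [0.9, 0.7, 0.2, 0.6]  # reports graded credences
outcomes = [1, 1, 0, 0]

print(brier_score(binary_agent, outcomes))    # 0.25
print(brier_score(bayesian_agent, outcomes))  # 0.125
```

On these made-up numbers the graded agent scores better because a proper scoring rule rewards well-calibrated intermediate credences, which a binary knowledge predicate can’t express at all.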
This is a good question for Eliezer Yudkowsky, since he seems to think Objective Bayesianism is it.