My ambition doesn’t go as far as solving epistemology. But it seems to me that the problem of the criterion relies on the sentence “everything requires justification”, which sounds wrong. I believe a different sentence instead: “everything except cousin_it’s axioms requires justification”. Call me self-serving, but it just sounds so right, and I can’t seem to derive a contradiction from it! :-)
Of course things are more fuzzy in practice. The “axioms” are more like laws, with provisions for amending themselves. So your beliefs could also be described as “everything except TAG’s axioms requires justification” etc. And a group of people can have shared knowledge justified by the common subset of their axioms, without appealing to anything universal.
That still leaves the question of what our axioms/laws actually say and where their self-amendment leads. But I suspect the answer to that is complicated and path-dependent, like everything else about us.
Epistemology remains unsolved because dropping the requirement for knowledge (not “everything”) to be justified creates further problems. In particular, treating individual axiom sets as true leads to relativism.
I only treat my axiom set as true. Is that relativism? What problems does it lead to?
What does your epistemology recommend for others? For example, should I:
1. treat cousin_it’s axioms as true?
2. treat Ikaxas’s axioms as true?
3. Something else?
If the first, why should the rule be
C: For all x, x should treat cousin_it’s axioms as true.
rather than say “treat TAG’s axioms as true” or “treat Barack Obama’s axioms as true” or “treat Joe Schmoe’s axioms as true”? Don’t symmetry considerations speak against this “epistemological egoism”?
If the second, then the rule seems to be
A: For all x, x should treat x’s axioms as true.
This is pretty close to relativism. Granted, A is not relativism. Relativism would be
R: For all x, x’s axioms are true for x
But it is fairly close. For all that, it may in fact be the best rule from a pragmatic perspective.
To put this in map-territory terminology: the best rule, pragmatically speaking, may be, “for all x, x should follow x’s map” (i.e. A), since x doesn’t really have unmediated access to other people’s maps, or to the territory. But the rule could not be “for all x, the territory corresponds to x’s map,” (i.e. R) since this would either imply that there is a territory for each person, when in fact there is only one territory, or it would imply that the territory contains contradictions, since some people’s maps contain P and others’ contain not-P.
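To spell out that dilemma semi-formally (a sketch with made-up notation, not from the original discussion: M_x is x’s map, read as a set of sentences, T is the one territory, and ⊨ is read “makes true”):

```latex
% Sketch only; requires amsmath. The notation M_x, T, \models is illustrative.
\begin{align*}
  \textbf{A:}\quad & \forall x.\ x \text{ should act on } M_x \\
  \textbf{R:}\quad & \forall x.\ T \models M_x
\end{align*}
% If two maps disagree, say P \in M_a and \neg P \in M_b, then R forces
\[
  T \models P \quad \text{and} \quad T \models \neg P,
\]
% which no single territory can satisfy. The escape hatch is one territory
% per person (\forall x.\ T_x \models M_x), and that is relativism proper.
```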
Alternatively, perhaps your epistemology only makes a recommendation for you, cousin_it, and doesn’t tell others what they should believe. But in that case it’s not complete.
Also, it’s not clear to me what “everything except cousin_it’s axioms requires justification” has to do with the original statement that “knowledge is relative to a particular process.” That statement certainly seems like it could be open to the charge of relativism.
Of course my honest advice to others is that they should follow my axioms! :-)
For example, let’s say I have an axiom that the winning lottery number is either 1234 or 4321. You’re thinking about playing the lottery, and have an opportunity to snoop the first digit of the winning number before making a bet. Then in my opinion, the best strategy for you to achieve your goal is to bet on 1234 if you learn that the first digit is 1, or bet on 4321 if you learn that the first digit is 4. Learning any other first digit is in my opinion impossible for you, so I don’t have any advice for that case. And if you have an axiom of your own, saying the winning number is either 1111 or 4444, then in my opinion you’re mistaken and following your axiom won’t let you achieve your goal.
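In code, that strategy is just “condition your axiom set on the observation, then bet on whatever survives”. Here is a toy sketch (the numbers and the one-digit snoop are only this example’s setup, not a general method):

```python
# Toy model: an "axiom set" is the set of worlds I treat as exhaustive.

def best_bet(axiom_worlds, observed_first_digit):
    """Return the bet my axioms recommend after snooping the first digit."""
    # Condition the axiom set on the observation.
    consistent = {w for w in axiom_worlds if w[0] == observed_first_digit}
    if not consistent:
        # By my axioms this observation was impossible, so I have no advice.
        return None
    # If one candidate survives, bet on it; otherwise report the survivors.
    return consistent.pop() if len(consistent) == 1 else sorted(consistent)

my_axioms = {"1234", "4321"}          # the winning number is 1234 or 4321
print(best_bet(my_axioms, "1"))       # -> '1234'
print(best_bet(my_axioms, "4"))       # -> '4321'
print(best_bet(my_axioms, "7"))       # -> None ("impossible" by my lights)
```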
That seems like the only reasonable way to think about beliefs, whether they are axioms or derived beliefs. Symmetry considerations do matter, but only to the extent they are themselves part of beliefs.
But I’m not sure why my advice on what you should do is relevant to you. After all, if I’m right, you will follow your own axioms and use them to interpret anything I say. The only beliefs we can agree on are beliefs that agree with both our axiom sets. Hopefully that’s enough to include the following belief: “a person can consistently believe some set of axioms without suffering from the problem of the criterion”. Moreover, we know how to build artificial reasoners based on axioms (e.g. a theorem prover) but we don’t know any other way, so it seems likely that people also work that way.
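For a rough sense of what “artificial reasoners based on axioms” means, here is a minimal forward-chaining sketch (a toy assuming Horn-style rules; any real theorem prover is far more involved):

```python
# Toy reasoner: everything it "knows" is an axiom or derived from axioms.

def derive(axioms, rules):
    """Close a set of atomic facts under (premises, conclusion) rules."""
    known = set(axioms)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= known and conclusion not in known:
                known.add(conclusion)  # justified relative to the axioms
                changed = True
    return known

axioms = {"socrates_is_a_man"}
rules = [(frozenset({"socrates_is_a_man"}), "socrates_is_mortal")]
print(derive(axioms, rules))  # both facts; each justified only relative
                              # to the axiom set we happened to start with
```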
There isn’t much content there. I don’t think it’s impossible to solve epistemology, but I don’t think it’s possible in one sentence.