Sure, but I don’t think I’d describe that as a deficiency of epistemic rationality. People dealing with disorders like paranoia clearly have unusually strong biases (or other issues, in the case of schizophrenia) to deal with, but exceptional epistemic rationality consists of compensating well for your biases, not of not having any in the first place.
Someone who’s irrationally predisposed to interpret others’ behavior as hostility, and who nonetheless strives with partial success to overcome that predisposition, is displaying better epistemic rationality skills than Joe Sixpack despite worse instrumental outcomes.
I am confused.
If a paranoiac has “unusually strong biases” but is exceptionally good at compensating for them, he would not be diagnosed with paranoia and would not be considered mentally ill.
My understanding of epistemic rationality is pretty simple: it is the degree to which your mental model matches reality, full stop. It does not care how bad your biases are or how good you are at overcoming them; all that matters is the final result.
I also don’t think full-blown clinical paranoia is a “bias”—I think it is exactly a wrong picture of reality that fails epistemic rationality.
I think you’re modeling epistemic rationality as an externally assessed attribute and I’m modeling it as a skill.
Not really. I am modeling epistemic rationality as a sum total of skill, and biases, and willingness to look for quality evidence, and ability to find such evidence, etc. It is all the constituent parts which eventually produce the final model-of-the-world.
And that final model-of-the-world is what you called “an externally assessed attribute”, but it is the result of epistemic rationality, not the thing itself.
So… you’d have a skill modifier plus or minus an ability modifier, and paranoiacs have a giant unrelated penalty?