I might actually expect the opposite. Mental illness is conventionally defined in terms of a marked impairment of everyday cognition compared to the general population, i.e. instrumental irrationality. Yet relatively few mental disorders, from my reading, seem to have direct effects on an epistemic level.
That sounds very strange to me. The standard things like schizophrenia, paranoia, etc. are characterized precisely by a “wrong” picture of reality. If a paranoiac is unwilling to venture out to buy groceries, that’s not a failure of instrumental rationality—if there actually were people outside his door who want to kill him (as he believes), his behavior would be perfectly rational.
Even depression has a strong epistemic component.
Sure, but I don’t think I’d describe that as a deficiency of epistemic rationality. People dealing with disorders like paranoia clearly have unusually strong biases (or other issues, in the case of schizophrenia) to deal with, but exceptional epistemic rationality consists of compensating well for your biases, not of being free of them in the first place.
Someone who’s irrationally predisposed to interpret others’ behavior as hostility, and who nonetheless strives with partial success to overcome that predisposition, is displaying better epistemic rationality skills than Joe Sixpack despite worse instrumental outcomes.
I am confused.
If a paranoiac has “unusually strong biases” but is exceptionally good at compensating for them, he would not be diagnosed with paranoia and would not be considered mentally ill.
My understanding of epistemic rationality is pretty simple: it is the degree to which your mental model matches reality, full stop. It does not care how bad your biases are or how good you are at overcoming them; all that matters is the final result.
I also don’t think full-blown clinical paranoia is a “bias”—I think it is exactly a wrong picture of reality that fails epistemic rationality.
I think you’re modeling epistemic rationality as an externally assessed attribute and I’m modeling it as a skill.
Not really. I am modeling epistemic rationality as the sum total of skill, biases, willingness to look for quality evidence, ability to find such evidence, etc. It is all the constituent parts that eventually produce the final model-of-the-world.
And that final model-of-the-world is what you called “an externally assessed attribute”, but it is the result of epistemic rationality, not the thing itself.
So… you’d have a skill modifier plus or minus an ability modifier, and paranoiacs have a giant unrelated penalty?