If we restrict our observations to minds that are capable of functioning in a moderately complex environment, universally compelling arguments (UCAs) come back, at least in math and maybe elsewhere. Defining “functioning” isn’t trivial, but it isn’t impossible either. If the mind has something like desires, then a functioning mind is one that tends to satisfy its desires more often than it would if it didn’t have them.
But it may be in the mind’s best interests to refuse to be persuaded by some specific class of argument: “It is difficult to get a man to understand something when his job depends on not understanding it” (Upton Sinclair). For any supposed UCA, one can construct a situation in which a mind can rationally choose to ignore it and therefore achieve its objectives better, or at least not be majorly harmed by it. You don’t even need to construct particularly far-fetched scenarios: we already see plenty of humans who benefit from ignoring scientific arguments in favor of religious ones, ignoring unpopular but true claims in order to promote claims that make them more popular, etc.
> For any supposed UCA, one can construct a situation in which a mind can rationally choose to ignore it and therefore achieve its objectives better, or at least not be majorly harmed by it.
I’m not convinced that this is the case for basic principles of epistemology. Under what circumstances could a mind (which behaved functionally enough to be called a mind) afford to ignore modus ponens, for example?
> But it may be in the mind’s best interests to refuse to be persuaded by some specific class of argument: “It is difficult to get a man to understand something when his job depends on not understanding it” (Upton Sinclair). For any supposed UCA, one can construct a situation in which a mind can rationally choose to ignore it
Where “rationally” means “instrumentally rationally”.
> You don’t even need to construct particularly far-fetched scenarios: we already see plenty of humans who benefit from ignoring scientific arguments in favor of religious ones, ignoring unpopular but true claims in order to promote claims that make them more popular, etc.
But they are not generally considered paragons of rationality. In fact, they are biased, and bias is considered inimical to rationality. Even by EY. At least when he is discussing humans.
Given that dspeyer specified “minds that are capable of functioning in a moderately complex environment”, instrumental rationality seems like the relevant criterion to use.
> I’m not convinced that this is the case for basic principles of epistemology. Under what circumstances could a mind (which behaved functionally enough to be called a mind) afford to ignore modus ponens, for example?
Well, it doesn’t have to, it could just deny the premises.
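The difference between rejecting a premise and rejecting the inference rule itself can be made concrete. A minimal, purely illustrative sketch in Lean, where modus ponens is just function application:

```lean
-- Modus ponens: given a proof of p → q and a proof of p, conclude q.
-- In Lean this is literally function application.
theorem modus_ponens (p q : Prop) (h : p → q) (hp : p) : q := h hp
```

Denying a premise just means never supplying `hp`, so no conclusion is forced and the rule is left untouched; denying modus ponens itself would amount to rejecting function application wholesale, which is hard to square with functioning at all.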
But it could deny modus ponens in some situations but not others.
Hmm. Like a person who is so afraid of dying that they have to convince themselves that they, personally, are immortal in order to remain sane?
From that perspective it does make sense.
That depends on what you mean by “behave functionally like a mind”. For starters, it could ignore it only occasionally.
> But they are not generally considered paragons of rationality. In fact, they are biased, and bias is considered inimical to rationality. Even by EY. At least when he is discussing humans.

> Given that dspeyer specified “minds that are capable of functioning in a moderately complex environment”, instrumental rationality seems like the relevant criterion to use.
Not sure how that’s relevant, given that the discussion was never restricted to (hypothetical) completely rational minds.