Hmm, so this seems like a problematic thing to tell someone: if I listen to you, then I’m going to be changing my mind about an object level question (“do we have reasons to be rational?”) because taking a certain position on that question violates a ‘virtue of rationality’. So if I do heed your warning, I fail in the very same way. If I don’t, then I’m stuck in my original failure.
But fair enough, I can’t think of a way to defend the idea of having reasons (specifically) to be rational at the moment.
I... I don’t know what to answer at this point. Do you have any idea how you came to care about being rational in the first place?
Would you rather be the one who did what you think of as rational, or the one who is currently smiling from on top of a giant heap of utility? (Too bad that post must use such a potentially controversial example...)
I don’t know for sure that I do care; I got started on this line of questioning by asking a moral nihilist (if that’s accurate) what they meant by ‘should’ in the claim that if you want to save kids, you should save them. Turns out, the consequent of that sentence is pleonastic with the antecedent.
Like you, probably, I’d raise doubts as to what the difference could be between being rational on a given occasion and getting the highest expected return. On the other hand, I don’t entirely trust my preferences, and the best way to represent the gap between what I want and what I should want seems to be by using words like ‘rationality’, ‘truth’ and maybe ‘goodness’. If you asked me to choose between the morally right thing and the thing that maximises my own standard of moral value, I’d unhesitatingly go for the former.
So I agree that there’s some absurdity in distinguishing, in some particular case, between rationality and a particular choice that maximises expected (objective) value. It may be wrong to conclude from this that we can eschew mention of rationality in our actual decision making, though: the home of that term may be as a goal or aim, rather than as something standing alongside a particular decision. Once the rational decision has been arrived at, it’s identical with ‘rationality’. Until then, rationality is the ideal that guides you there. Something like that.