Yes, if all the assumptions you made hold (also, no declining marginal utility for any ice cream flavour, no preference for variety for variety’s sake, and similar), then I would call eating strawberry ice-cream irrational.
(How likely these assumptions are to be a reasonable approximation to a scenario in real life, that’s another story; for example, people get bored when they always do the same things.)
Okay, thanks. So the ‘should’ of ‘you should save the children (if you want to)’ is a ‘should’ of rationality. Now do I have any reason at all to be rational in this way, or do I just have reason to get the thing I want (i.e. by reason of wanting it)?
I mean that if I want X, and this is a reason to get X, do I have another reason to get X, namely that to do so would be rational and to fail to do so would be irrational?
You appear to be failing the twelfth virtue. Rationality is that which leads you to systematically get what you want, not some additional thing you might want in itself.
Hmm, so this seems like a problematic thing to tell someone: if I listen to you, then I’m going to be changing my mind about an object-level question (“do we have reasons to be rational?”) because taking a certain position on that question violates a ‘virtue of rationality’. So if I do heed your warning, I fail in the very same way. If I don’t, then I’m stuck in my original failure.
But fair enough, I can’t think of a way to defend the idea of having reasons (specifically) to be rational at the moment.
I—I don’t know what to answer at this point—Do you have any idea how you came to care about being rational in the first place?
Would you rather be the one who did what you think of as rational, or the one who is currently smiling from on top of a giant heap of utility? (Too bad that post must use such a potentially controversial example...)
I don’t know for sure that I do care; I got started on this line of questioning by asking a moral nihilist (if that’s accurate) what they meant by ‘should’ in the claim that if you want to save kids, you should save them. Turns out, the consequent of that sentence is pleonastic with the antecedent.
Like you, probably, I’d raise doubts as to what the difference could be between being rational on a given occasion and getting the highest expected return. On the other hand, I don’t entirely trust my preferences, and the best way to represent the gap between what I want and what I should want seems to be by using words like ‘rationality’, ‘truth’ and maybe ‘goodness’. If you asked me to choose between the morally right thing and the thing that maximises my own standard of moral value, I’d unhesitatingly go for the former.
So I agree that there’s some absurdity in distinguishing, in some particular case, between rationality and a particular choice that maximises expected (objective) value. It may be wrong to conclude from this that we can eschew mention of rationality in our actual decision-making, though: the home of that term may be as a goal or aim, rather than as something standing alongside a particular decision. Once the rational decision has been arrived at, it’s identical with ‘rationality’. Until then, rationality is the ideal that guides you there. Something like that.