Is there such a thing as a desire that’s actually just stupid? I’m not sure that’s possible in a world where humans are just one possible sort of evolved intelligence and metamorality is per-species. But within humanity, can we say someone’s values are in fact stupid? Irrational?
One possible example suggested by Parfit is “Future Tuesday Indifference”. Suppose Bob cares about his future self and in particular prefers not to have pain in the future, as normal people do—except on any future Tuesday. He does not care at all if he is going to suffer pain on a future Tuesday. He is happy to choose on Sunday to have an extremely painful operation scheduled for Tuesday instead of a mild one for either Monday or Wednesday. It is not that he does not suffer the pain when Tuesday comes: he is only (irreducibly, arbitrarily) indifferent to pain on Tuesdays that are in the future. Parfit argues that Bob has irrational preferences.
Of course this is a very contrived and psychologically unrealistic example. But Parfit argues that, once it is granted that brute preferences can be irrational, the question is open whether many of our actual preferences are too (perhaps because they are subtly grounded in distinctions that, when examined, are as arbitrary as that between a future Tuesday and a future Monday).
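To make the arbitrariness concrete, here is a minimal toy sketch (my own illustration, not anything in Parfit): a disutility function that assigns zero weight to pain on any future Tuesday, so that from Sunday's vantage point an agonizing Tuesday operation scores better than a mild Monday one.

```python
import datetime

def bobs_disutility(pain_intensity: float, date: datetime.date,
                    today: datetime.date) -> float:
    """Disutility Bob assigns, as of `today`, to pain of given intensity on `date`."""
    is_future = date > today
    if is_future and date.weekday() == 1:  # weekday() == 1 is Tuesday
        return 0.0                         # future-Tuesday pain counts for nothing
    return pain_intensity

today = datetime.date(2024, 6, 2)    # a Sunday
monday = datetime.date(2024, 6, 3)
tuesday = datetime.date(2024, 6, 4)

mild_on_monday = bobs_disutility(1.0, monday, today)      # 1.0
agony_on_tuesday = bobs_disutility(100.0, tuesday, today)  # 0.0

# On Sunday, Bob prefers the agonizing Tuesday operation to the mild Monday one,
# even though he will feel the full pain when Tuesday arrives.
assert agony_on_tuesday < mild_on_monday
```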
Yes. The obvious one to me is that it is totally irrational of me to want to eat a pile of sweets that I know from previous experience will make me feel bad about myself ten minutes after eating it, and which I know I don’t need nutritionally. I can make myself not do it, but to make myself not want to is like trying not to see an optical illusion...
In my experience, there’s lots of ways to make myself not want to eat those sweets.
For example, I can get out of the house and go for a brisk walk, or better yet go to the gym and work out. IME, while I’m exercising I rarely find myself craving food of any sort unless I’m genuinely hungry.
Or I can make myself a large portion of something else to eat, and eat it until I’m stuffed. IME, I don’t want to eat sweets when I’m actively full.
Or I can go to sleep. IME, I don’t want to eat sweets while I’m sleeping.
Or I can douse the sweets with urine. IME, I don’t want to eat sweets doused in urine.
Or many other possibilities.
The problem is I don’t want to do any of those things, either. Which is a remarkable coincidence, when I stop to think about it. In fact, a neutral observer might conclude that I want to want to eat the sweets.
Wants are like pains: sometimes they’re useful information that something needs attending to, and sometimes they’re irrelevant distractions, because there are more important things to do, that just have to be endured and otherwise ignored.
Suppose I value risk, and I value longevity, and I live in a world such that high-risk environments are always low-longevity and high-longevity environments are always low-risk.
I want to say something disparaging about that arrangement, but I’m not sure what it means to call it either “stupid” or “irrational”.
Certainly, it doesn’t follow from the fact that I have those values that I’m stupid.
It is certainly true that my behavior will predictably be less effective at optimizing my environment for my values in that case than it would be otherwise, but the same could be said for valuing happiness over paperclips.
(shrug) I think the word I want here is “inelegant.”
Required reading. And the other links here.
Yes.
Non-metonymically? Why is this an interesting question?