I feel a certain sense of mental struggle when considering whether to torrent music. I don’t feel this same sense of mental struggle when considering whether or not to murder or steal or cheat. Although both of these are situations that call on my personal morality, the torrenting situation seems to be an interesting special case.
We need a word for the way in which the torrenting situation is a special case, and not just another case where I don’t murder or steal or cheat because I’m not that kind of person. The majority of the English-speaking world seems to use “willpower”. As far as I know there’s no other definition of willpower by which we could say, “Oh, that’s real willpower; this torrenting thing is something else.” If we didn’t have the word “willpower”, we’d have to make up a different one, like “conscious alignment in mental struggle” or something.
So why not use the word “willpower” here?
Suppose that you have one extra ticket to the Grand Galloping Gala, and you have several friends who each want it desperately. You can give it to only one of them. Doesn’t the agonizing over that decision feel a lot like the agonizing over whether to buy or torrent? Yet we don’t think of that as involving willpower.
At the risk of totally reducing this to unsupportable subjective intuitions... no, the two decisions wouldn’t feel the same at all.
I can think of some cases in which it would feel similar: if one of the ticket-seekers were my best friend whom I’d known forever, another were a girl I was trying to impress, and I had to decide between loyalty to my best friend and personal gain from impressing the girl; or if one of the ticket-seekers had an incurable disease and this was her last chance to enjoy herself, while the other was a much better friend and much more fun to be around. But both of these are, in some way, moral issues.
In the simple ticket-seeker case without any of these complications, there would be a tough decision, but it would be symmetrical: there would be certain reasons for choosing Friend A, and certain others for choosing Friend B, and I could just decide between them. In the torrenting case, and the complicated ticket-seeker cases, it feels asymmetrical: like I have a better nature tending toward one side, and temptation tending toward the other side. This asymmetry seems to be the uniting factor behind my feeling of needing “willpower” for some decisions.
Mm.
So, OK, to establish some context first: one (ridiculously oversimplified) way of modeling situations like this is to say that in both cases I have two valuation functions, F1 and F2, which give different results when comparing the expected value of the two choices, because they weight the relevant factors differently (for example, the relative merits of being aligned with my better nature versus giving in to temptation, or of choosing friend A versus friend B). In the first (simple) case the two functions are well integrated, so I can easily calculate a weighted average of them; in the second (complicated) case they are poorly integrated, so averaging their results is more difficult. That means that in the first case, by the time the results become available to consciousness I’ve already made the decision, so I feel like I can “just decide”; in the second case I haven’t yet, so I feel like I have a difficult decision to make. The difference is really one of how aware I am of the decision. (There are a lot of situations like this, where an operation that can be performed without conscious monitoring “feels easy.”)
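To make that toy model a bit more concrete, here's a minimal sketch in Python; the options, numbers, and the 50/50 integration weight are invented purely to illustrate the two-valuation-function idea:

```python
# Toy version of the F1/F2 model: two valuation functions that score the same
# options but weight the relevant factors differently.

def f1(option):
    # "better nature"-flavored valuation (invented numbers)
    return {"buy": 1.0, "torrent": 0.2}[option]

def f2(option):
    # "temptation"-flavored valuation (invented numbers)
    return {"buy": 0.3, "torrent": 0.9}[option]

def combined_value(option, w=0.5):
    # When F1 and F2 are well integrated, this weighted average gets computed
    # effortlessly, and the winner is settled before the question feels "hard".
    return w * f1(option) + (1 - w) * f2(option)

print(max(["buy", "torrent"], key=combined_value))  # "buy", with these numbers
```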
So. In both cases the decision is “asymmetrical,” in that F1 and F2 are different functions, but in the torrenting case (and the complicated ticket case), the difference between F1 and F2 is associated with a moral judgment (leading to words like “better nature” and “temptation”). Which feels very significant, because we’re wired to attribute significance to moral judgments.
I wonder how far one could get by treating it like any other decision procedure, though. For example, if I decide explicitly that I weight “giving in to temptation” and “following my better nature” with a ratio of 1:3, and I flip coins accordingly to determine whether to torrent or not (and adjust my weighting over time if I’m not liking the overall results)… do I still need so much “willpower”?
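For what it's worth, a throwaway sketch of that coin-flipping procedure (Python; the 3:1 weighting and the adjustment rule are just assumptions for illustration):

```python
import random

# Weight "following my better nature" vs. "giving in to temptation" at 3:1,
# i.e. buy with probability 0.75 and torrent with probability 0.25.
p_buy = 0.75

def decide(p_buy):
    return "buy" if random.random() < p_buy else "torrent"

def adjust(p_buy, happy_with_results, step=0.05):
    # Tune the weighting over time instead of agonizing over each choice.
    return p_buy if happy_with_results else min(1.0, p_buy + step)

print(decide(p_buy))
```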
I love the idea of the coin-flip diet, although it can be gamed by proposing to eat things more often. Maybe you could roll a six-sided die for each meal: 1 = oatmeal and prune juice, 2-3 = lentil soup, 4-5 = teriyaki chicken, 6 = Big Mac or ice cream.
If you know the weight and have a way of sorting the things you would otherwise flip a coin over, you can use the sorting order instead of the coin. For instance, I typically buy rather than torrent if the artist is in the bottom half of artists sorted by income.
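A sketch of that trick, assuming the “weight” is the long-run fraction of cases that should come out as “buy” (the artist list and its ordering are invented):

```python
# Deterministic stand-in for the coin flip: with a "buy" weight of p,
# buy whenever the artist falls in the bottom fraction p of artists by income.

artists_by_income = ["garage_band", "indie_act", "mid_label_act", "stadium_act"]  # ascending income

def should_buy(artist, p_buy=0.5):
    rank = artists_by_income.index(artist)
    return rank < p_buy * len(artists_by_income)

print(should_buy("indie_act"))    # True  -> buy (bottom half by income)
print(should_buy("stadium_act"))  # False -> torrent
```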
I diet more or less this way. Not a coinflip, but a distribution that seems sustainable in the long term. Lentil soup twice a week, Big Mac and ice cream once a week, so to speak.
Or, if I wanted to choose between a car with good gas mileage and one with good performance, that could seem moral. Or if I were choosing between a food high in sugar and one high in protein. Or one high in potassium and one high in calcium.
What’s an example of an amoral choice?
Choosing between two cars with equally good gas mileage and performance, one of which has more trunk space and one of which has a roof rack.