So, when I agonize over whether to torrent an expensive album instead of paying for it, and about half the time I end up torrenting it and feeling bad, and about half the time I pay for it but don’t enjoy doing so … what exactly am I doing in the latter case if not employing willpower?
I mean, I know willpower probably isn’t a real thing on the deepest levels of the brain, but it’s fake in the same way centrifugal force is fake, not in the way Bigfoot is fake. It sure feels like I’m using willpower when I make moral decisions about pirating, and I don’t understand how your model above interprets that.
Granted, there are many other moral decisions I make that don’t require willpower and do conform to your model above, and if I had to choose, black-and-white, between ethics-as-willpower and ethics-as-choice, I’d take the latter; your model just doesn’t seem complete.
My interpretation of the post in this case is: it’s not that you’re not employing willpower; rather, you’re not employing personal morality. So, while TORRENT vs. BUY fits into the societal ethics view, it does not fit into your personal morality.
From the personal morality perspective, the bad feeling you get is the thing you need willpower to fight against/suppress. You probably also need willpower to fight against/suppress the bad feeling you might be getting from buying the album. These need not be mutually exclusive. Personal morality can be both against torrenting and against spending money unduly.
Let me rephrase my objection, then.
I feel a certain sense of mental struggle when considering whether to torrent music. I don’t feel this same sense of mental struggle when considering whether or not to murder or steal or cheat. Although both of these are situations that call on my personal morality, the torrenting situation seems to be an interesting special case.
We need a word to define the way in which the torrenting situation is a special case and not just another case where I don’t murder or steal or cheat because I’m not that kind of person. The majority of the English-speaking world seems to use “willpower”. As far as I know there’s no other definition of willpower, where we could say “Oh, that’s real willpower, this torrenting thing is something else.” If we didn’t have the word “willpower”, we’d have to make up a different word, like “conscious-alignment in mental struggle” or something.
So why not use the word “willpower” here?
Suppose that you have one extra ticket to the Grand Galloping Gala, and you have several friends who each want it desperately. You can give it to only one of them. Doesn’t the agonizing over that decision feel a lot like the agonizing over whether to buy or torrent? Yet we don’t think of that as involving willpower.
At the risk of totally reducing this to unsupportable subjective intuitions...no, the two decisions wouldn’t feel the same at all.
I can think of some cases in which it would feel similar. If one of the ticket-seekers was my best friend whom I’d known forever, and another was a girl I was trying to impress, and I had to decide between loyalty to my best friend and personal gain from impressing the girl. Or if one of the ticket-seekers had an incurable disease and this was her last chance to enjoy herself, and the other was a much better friend and much more fun to be around. But both of these are, in some way, moral issues.
In the simple ticket-seeker case without any of these complications, there would be a tough decision, but it would be symmetrical: there would be certain reasons for choosing Friend A, and certain others for choosing Friend B, and I could just decide between them. In the torrenting case, and the complicated ticket-seeker cases, it feels asymmetrical: like I have a better nature tending toward one side, and temptation tending toward the other side. This asymmetry seems to be the uniting factor behind my feeling of needing “willpower” for some decisions.
Mm.
So, OK, to establish some context first: one (ridiculously oversimplified) way of modeling situations like this is to say that in both cases I have two valuation functions, F1 and F2, which give different results when comparing the expected value of the two choices, because they weight the relevant factors differently (for example, the relative merits of being aligned with my better nature and giving in to temptation, or the relative merits of choosing friend A and friend B). In the first (simple) case, the two functions are well integrated, so I can easily calculate their weighted average; in the second (complicated) case, they are poorly integrated, so averaging their results is more difficult. By the time the results become available to consciousness in the first case, I’ve already made the decision, so I feel like I can “just decide.” In the second case, I haven’t yet, and therefore feel like I have a difficult decision to make. The difference is really one of how aware I am of the decision. (There are a lot of situations like this, where an operation that can be performed without conscious monitoring “feels easy.”)
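For concreteness, here’s a minimal sketch of that toy model in Python; the options, scores, and weights are all invented for illustration:

```python
# Toy model: two valuation functions, F1 and F2, scoring the same options.
def f1(option):
    # "Better nature": weights honesty and supporting the artist heavily.
    return {"buy": 10, "torrent": 2}[option]

def f2(option):
    # "Temptation": weights money saved and convenience heavily.
    return {"buy": 3, "torrent": 9}[option]

def decide(options, w1=0.5, w2=0.5):
    # Well-integrated case: the weights are already settled, so the weighted
    # average gets computed before the question ever feels hard.
    return max(options, key=lambda o: w1 * f1(o) + w2 * f2(o))

print(decide(["buy", "torrent"]))  # -> "buy", with these made-up numbers
```

On this model, the poorly integrated case is the one where w1 and w2 are themselves still contested, which is where the feeling of a difficult decision comes from.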
So. In both cases the decision is “asymmetrical,” in that F1 and F2 are different functions, but in the torrenting case (and the complicated ticket case), the difference between F1 and F2 is associated with a moral judgment (leading to words like “better nature” and “temptation”). Which feels very significant, because we’re wired to attribute significance to moral judgments.
I wonder how far one could get by treating it like any other decision procedure, though. For example, if I decide explicitly that I weight “giving in to temptation” and “following my better nature” with a ratio of 1:3, and I flip coins accordingly to determine whether to torrent or not (and adjust my weighting over time if I’m not liking the overall results)… do I still need so much “willpower”?
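As a sketch of what that procedure might look like (the 1:3 weights and the option labels are just the hypothetical numbers above):

```python
import random

# Hypothetical weighting: give in to temptation 1 time in 4, on average.
weights = {"torrent": 1, "buy": 3}

def weighted_flip(weights):
    options = list(weights)
    return random.choices(options, weights=[weights[o] for o in options])[0]

decision = weighted_flip(weights)
# If the long-run mix feels wrong, adjust the weights and keep going,
# rather than re-fighting the battle for each individual album.
```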
I love the idea of the coin-flip diet. Although it can be gamed by proposing to eat things more often. Maybe you could roll a 6-sided die for each meal. 1 = oatmeal and prune juice, 2-3 = lentil soup, 4-5 = teriyaki chicken, 6 = Big Mac or ice cream.
If you know the weight, and you have a way of sorting the things you would flip a coin for, you can use the sorting order instead. For instance, I typically buy rather than torrent if the artist is in the bottom half of artists sorted by income.
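A sketch of that deterministic variant, with hypothetical income figures (buy_fraction plays the role the coin-flip weight played above):

```python
# Deterministic version of the weighted flip: apply the weight as a cutoff
# on some sorted attribute instead of randomizing each decision.
def should_buy(artist_income, all_artist_incomes, buy_fraction=0.5):
    # Buy when the artist falls in the bottom `buy_fraction` by income.
    ranked = sorted(all_artist_incomes)
    cutoff = ranked[int(len(ranked) * buy_fraction)]
    return artist_income < cutoff

print(should_buy(12_000, [8_000, 12_000, 50_000, 200_000]))  # -> True
```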
I diet more or less this way. Not a coinflip, but a distribution that seems sustainable in the long term. Lentil soup twice a week, Big Mac and ice cream once a week, so to speak.
Or, if I wanted to choose between a car with good gas mileage and one with good performance, that could seem moral. Or if I were choosing between a food high in sugar, or one high in protein. Or one high in potassium, or one high in calcium.
What’s an example of an amoral choice?
Choosing between two cars with equally good gas mileage and performance, one which has more trunk space and one which has a roof rack.
It all depends on why you decide to torrent/not torrent:
Are you more likely to torrent if the album is very expensive, or if it is very cheap? If you expect it to be of high quality, or of low quality? If the store you could buy the album at is far away, or very close? If you like the band that made it, or if you don’t like them? Longer albums or shorter? Would you torrent less if the punishment for doing so was increased? Would you torrent more if it was harder to get caught? What if you were much richer, or much poorer?
I’m confident that if you were to analyze when you torrent vs. when you buy, you’d notice trends that, with a bit of effort, could be translated into a fairly reasonable “Will I Torrent or Buy?” function that predicts whether you’ll torrent or not with much better accuracy than random.
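To cash that out as a sketch: log each decision along with a few of the features from the questions above, then fit a simple classifier. All the features, numbers, and labels here are invented for illustration:

```python
from sklearn.linear_model import LogisticRegression

# Each row: [album price ($), expected quality (1-10),
#            distance to store (km), liking of band (1-10)]
X = [[15.0, 8, 1.0, 9],
     [30.0, 5, 10.0, 3],
     [10.0, 9, 0.5, 8],
     [25.0, 4, 8.0, 2],
     [28.0, 7, 6.0, 4],
     [12.0, 6, 2.0, 7]]
y = [0, 1, 0, 1, 1, 0]  # 0 = bought, 1 = torrented

model = LogisticRegression().fit(X, y)

# Estimated probability of torrenting a new $20 album of expected quality 7,
# with the store 5 km away and the band rated 6:
print(model.predict_proba([[20.0, 7, 5.0, 6]])[0][1])
```

Even a crude fit like this would make the “trends” claim testable: if the function predicts well out of sample, the decisions weren’t as mysterious as they felt.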
Yes, but the function might well include terms for things like how rude Yvain’s co-workers were to Yvain that day, what mood Yvain was in that day, and whether Yvain was hungry at the moment, i.e., stuff a reasonably behaved utility function shouldn’t have terms for but the outcome of a willpower-based struggle very well might.
I’m sure that’s true, but what relevance does that have to the current discussion?