I am still confused about aspects of the torture vs. specks problem. I’ll grant for this comment that I would be willing to choose torture for 1 person for 50 years to avoid a dust speck in the eye of 3^^^3 people. Numerically I’ll just assign −3^^^3 utilons to the specks (collectively) and −10^12 utilons to the torture. Where confusion sets in is when I consider the possibility of a third form of disutility between the two extremes, for example paper cuts.
Suppose that 1 paper cut is −100 utilons and 50 years of torture is −10^12 utilons, so that 10^10 paper cuts and the torture have the same total expected utility*. However, my personal preference would be to choose the 10^10 paper cuts over 50 years of torture. Similarly, if a broken bone is worth −10^4 utilons, I would rather that the same 10^10 people got a paper cut instead of only 10^8 people getting a broken bone. The best case would be if I could avoid 3^^^3 specks in exchange for somewhat fewer than 3^^^3 just-barely-more-irritating specks, instead of torturing, breaking, or cutting anyone.
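To keep the arithmetic honest, here is the bookkeeping behind those comparisons; the utilon values are just the toy assignments above, not anything measured.

```python
# Toy utilon values from above (assumptions, not measurements).
u_paper_cut   = -100
u_broken_bone = -10**4
u_torture     = -10**12

print(10**10 * u_paper_cut)    # -10^12: 10^10 paper cuts total the same as one torture
print(10**8  * u_broken_bone)  # -10^12: 10^8 broken bones total the same as well
```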
Therefore, maximizing average or total expected utility doesn’t seem to capture all my preferences. I think I can best describe what I want as choosing the maximum of the minimum individual utilities while still maximizing total or average utility. As such I am inclined to choose specks over torture, probably as a result of trying to find a more palatable tradeoff involving broken bones, paper cuts, or slightly-more-annoying specks. In real life, unlike in hypotheticals, such compromises are usually available. Still, I wonder if it might be acceptable (or even moral) to accept only 99% of the maximum possible utility if it allows significant maximin-ing of some otherwise very negative individual utilities.
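A toy sketch of the tie-breaking rule I have in mind, for concreteness; the representation (count, per-person utilons) and the function name are mine, purely for illustration.

```python
def choose(outcomes, slack=0.0):
    """Among outcomes whose total utility is within `slack` of the best total,
    pick the one whose worst-off individual is best off."""
    totals = {name: sum(n * u for n, u in harms) for name, harms in outcomes.items()}
    best_total = max(totals.values())
    acceptable = [name for name, total in totals.items() if total >= best_total - slack]
    return max(acceptable, key=lambda name: min(u for _, u in outcomes[name]))

outcomes = {
    "torture":    [(1, -10**12)],     # one person tortured for 50 years
    "paper_cuts": [(10**10, -100)],   # 10^10 people each get a paper cut
}
print(choose(outcomes))  # "paper_cuts": the totals tie, but the minimum is better
```

Setting slack > 0 corresponds to the "99% of the maximum possible utility" idea: the rule will give up a bounded amount of total utility in exchange for a better worst case.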
*Assume a universal population of 4^^^4 individuals, with the roles selected at random, so that utility isn’t affected by changing the total number of individuals.
I think this is one of the legitimate conclusions you can draw from torture vs. dust specks. It’s not that your intuition is necessarily wrong (though it may be), but that a simple multiplicative rule may NOT accurately describe your utility function. You can’t choose torture based on simple addition, but that doesn’t necessarily mean choosing torture isn’t what you should do given your UF.
I don’t think it’s the specifics of the multiplicative accumulation of individual utilities that matters; just imagine that, however I calculate the utility of torture and paper cuts, there is some lottery where I am VNM-indifferent between 10^10 paper cuts and torture for 50 years. 10^10 + 1 paper cuts would be too much and I would opt for torture; 50 years + 1 second of torture would be too much and I would opt for paper cuts. However, given the VNM-indifferent choice, I would still have a preference for paper cuts over torture, because that choice maximizes the minimum individual utility while still maximizing overall utility. (−10^12 is the minimum individual utility when choosing torture; −100 is the minimum individual utility when choosing paper cuts; total utility is −10^12 either way; average utility is −10^12 / (10^10 + 1) either way; so I am fairly certain the latter two criteria are indifferent between the choices. If I’ve just made a math error, that would help alleviate my confusion.)
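For what it’s worth, writing the parenthetical out as explicit arithmetic (with 0 standing in for an unaffected individual):

```python
n = 10**10 + 1                        # everyone who could be affected
total_torture    = -10**12            # one person at -10^12
total_paper_cuts = 10**10 * -100      # 10^10 people at -100 each

print(total_torture == total_paper_cuts)        # True: the totals tie
print(total_torture / n, total_paper_cuts / n)  # identical averages either way
print(min(-10**12, 0), min(-100, 0))            # worst-off individual: -10^12 vs. -100
```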
To me, at least, it seems like this preference is not captured by utilitarianism using VNM-utility. I think it’s almost possible to explain it in terms of negative utilitarianism, but I don’t actually try to minimize overall harm; I just try to minimize the greatest individual harm while keeping total or average utility maximized (or sufficiently close to maximal).
Obviously you’re going to get wrong specific answers if you’re just pulling exponents out of thin air. The torture vs. specks example works because the answer would be the same whether a speck were worth as much as a year of torture, or 10^-10 as much, or 10^-1000 as much.
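To make that robustness concrete, a back-of-the-envelope check; I’m using 3^^4, a vanishingly small lower bound on 3^^^3, just so the numbers fit in a float.

```python
import math

# 3^^4 = 3^(3^27); its base-10 digit count is 3^27 * log10(3), about 3.6 trillion.
digits = 3**27 * math.log10(3)
print(f"{digits:.2e}")         # ~3.64e12 digits

# Discounting each speck by 10^-1000 removes only 1000 of those digits,
# so the comparison with any torture-sized number doesn't change.
print(f"{digits - 1000:.2e}")  # still ~3.64e12
```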
Getting approximate utilities is tricky; the general practice is to come up with two situations you’re intuitively indifferent between, where one involves a small event for sure, and the other involves a die roll and then a big event that happens only with a certain probability depending on it. Only AFTER you’ve come up with this kind of preference do you put numbers on anything, although often you’ll find this unnecessary, as just thinking about it like this resolves your confusion.
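For concreteness, a minimal sketch of how a number falls out of such an indifference; the indifference probability here is made up purely to show the bookkeeping.

```python
# Suppose you're indifferent between "one paper cut for sure" and
# "50 years of torture with probability p, nothing otherwise".
# Normalizing u(nothing) = 0, VNM indifference gives u(paper_cut) = p * u(torture).
p           = 1e-10        # hypothetical indifference probability
u_torture   = -10**12      # the toy utilon value used upthread
u_paper_cut = p * u_torture
print(u_paper_cut)         # -100.0, consistent with the number used above
```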