It’s the entropy of the distribution, so you should at least bin it into some kind of histogram (or take the limit to a continuous approximation) to get something meaningful out of it. But I agree about the magnitude of the difference, which does indeed matter in the Omelas example; the entropy for a system in which everyone has 100 except for one who has 99 would be exactly the same as for Omelas itself. However, the average utility would be different, so the total “free utility” still wouldn’t be the same.
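A quick numerical sketch of both points (the utility profiles and bin width are just assumptions for the example):

```python
import numpy as np

# Hypothetical profiles: "Omelas" (one person at 0) vs. near-equality (one at 99)
omelas  = np.array([100.0] * 99 + [0.0])
near_eq = np.array([100.0] * 99 + [99.0])

def binned_entropy(u, bins=101, lo=-0.5, hi=100.5):
    """Shannon entropy of the utility distribution after binning into a histogram."""
    counts, _ = np.histogram(u, bins=bins, range=(lo, hi))
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

# Both profiles split 99-vs-1 across two bins, so their entropies coincide...
print(binned_entropy(omelas), binned_entropy(near_eq))  # same value (~0.056)
# ...but the average utility still tells them apart.
print(omelas.mean(), near_eq.mean())  # 99.0 vs 99.99
```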
I think variance alone suffers from limits too; you’d need a function of the full series of moments of the distribution, which is why I found entropy more potentially attractive. Perhaps the moment generating function of the distribution comes closer to the properties we’re interested in? In the thermodynamics analogy it also closely resembles the partition function.
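For what it’s worth, the resemblance is exact up to a constant: if $M(t) = \mathbb{E}[e^{tU}]$ is the moment generating function of the empirical utility distribution, then $Z(\beta) = \sum_i e^{-\beta u_i} = N \cdot M(-\beta)$. A quick check (profile and $\beta$ arbitrary):

```python
import numpy as np

u = np.array([100.0] * 99 + [0.0])  # hypothetical utility profile
N, beta = len(u), 0.05

mgf = lambda t: np.mean(np.exp(t * u))  # M(t) = E[exp(t*U)] over the empirical distribution
Z   = lambda b: np.sum(np.exp(-b * u))  # partition-function analogue

print(Z(beta), N * mgf(-beta))  # equal: Z(beta) = N * M(-beta)
```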
I think that the thing you want is probably to maximize
$$N \cdot \frac{\sum_i u_i e^{-u_i/T}}{\sum_i e^{-u_i/T}}$$
or
$$-\log\left(\sum_i e^{-u_i/T}\right)$$
where $u_i$ is the utility of the $i$-th person, and $N$ is the number of people; not sure which.
That way you get in one limit the veil of ignorance for utility maximizers, and in the other limit the veil of ignorance of Rawls (extreme risk aversion).
That way you also don’t have to treat the mean utility separately.
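A minimal sketch of that objective (the utility profile and temperatures are purely illustrative), showing the two limits:

```python
import numpy as np

def soft_weighted_utility(u, T):
    """Compute sum(u_i * exp(-u_i/T)) / sum(exp(-u_i/T))."""
    # Shift by the minimum before exponentiating; the constant factor
    # exp(u.min()/T) cancels in the ratio but avoids overflow/underflow.
    w = np.exp(-(u - u.min()) / T)
    return np.sum(u * w) / np.sum(w)

u = np.array([100.0] * 99 + [0.0])  # hypothetical Omelas-style profile

print(soft_weighted_utility(u, T=1e6))   # ~99.0: T -> inf recovers the average (utilitarian limit)
print(soft_weighted_utility(u, T=1e-2))  # ~0.0:  T -> 0 recovers the minimum (Rawlsian limit)
```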
Well, that sure does look a lot like a utility partition function. Not sure why you’d use N if you’re also doing a sum, though; that will already scale with the number of people. If anything,
$$\frac{1}{Z}\sum_i u_i e^{-u_i/T}$$
becomes the average utility in the limit $T \to \infty$, whereas for $T \to 0$ you converge on maximizing simply the single lowest utility around. If we write it as:
$$Z = \sum_i e^{-\beta u_i}, \quad \text{with } \beta = 1/T,$$
then we’re maximizing $-\frac{\partial \log Z}{\partial \beta}$. It’s interesting, though, that despite the similarity this isn’t an actual partition function, because the individual terms don’t represent probabilities; they’re more like weights. We’re basically discounting utility more and more the more of it you accumulate.
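A quick finite-difference check of that identity (utilities and $\beta$ chosen arbitrarily):

```python
import numpy as np

u = np.array([3.0, 1.0, 4.0, 1.0, 5.0])  # arbitrary utilities
beta, eps = 0.7, 1e-6

logZ = lambda b: np.log(np.sum(np.exp(-b * u)))

# Central finite difference for -d(logZ)/d(beta)
fd = -(logZ(beta + eps) - logZ(beta - eps)) / (2 * eps)

# Closed form: the weight-averaged utility (1/Z) * sum(u_i * exp(-beta * u_i))
w = np.exp(-beta * u)
closed = np.sum(u * w) / np.sum(w)

print(fd, closed)  # agree to within the finite-difference error
```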
I agree with all of it. I think that I threw the N in there because average utilitarianism is super counterintuitive to me, so I tried to make it total utility.
And also about the weights: to value equality is basically to weight the marginal happiness of the unhappy more than that of the already happy. Or, when behind the veil of ignorance, to consider yourself unlucky and therefore more likely to be born as the unhappy. Or what you wrote.