I said ‘something akin to Virtue Points’, because I agree that someone getting hit is not actually more virtuous than someone not getting hit. I can understand why you would be very surprised if I thought that.
I think perhaps the whole post could be rewritten and framed in terms of suffering (or pain, or something of that nature), because I think that’s essentially what I’m getting at, and I feel it might be what Scott is getting at as well. I think it’s a highly common intuition that suffering is bad, and people often think that those who suffer deserve some kind of compensation, regardless of whether it was voluntary or not.
For example, say I have the following options:
A) Give a meal to a starving child.
B) Give something equally valuable to a healthy, non-starving child (note that obviously ‘something equally valuable’ doesn’t mean ‘a meal’ in this case, because a meal is a lot less valuable to a non-starving child than to a starving child. It’d probably have to be something more expensive than a meal.)
I’ve tried to define this such that, from a utilitarian perspective, there’s no difference between choosing option A and choosing option B.
I’d still rather choose A, because even though I know the Utility Points from both A and B are equal, there’s something about balancing out past suffering that makes me feel nice and fuzzy inside, and gives me a sense of justice. I expect this sense of justice is very common.
I should say that I think my post generally should not change the behaviour of people who hold strongly utilitarian views. But I think that even those who would consider themselves staunch utilitarians still possess, to some degree, these evolved intuitions about virtue and suffering, and to the extent that they do, I feel like it’d be nice (and probably valuable) for them (and everyone else) to assign their mental Virtue Points in ways that make more sense and are fairer.
In defining A and B as equally valuable, I have to equate the two. That said, it’s hard to imagine something that would be as valuable to a healthy, non-starving child right now as the meal is to the starving child, so if the valuable thing you gave in scenario B were at all marketable, the inefficiency of using it to help B instead of the x (where x > 1) starving children it could feed would make the real pseudo-equation:
utility(A) = utility(B)
cost(B) = x * cost(A)
if x > 1, do A; if x < 1, do B
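To make the pseudo-equation concrete, here is a minimal sketch in Python. The dollar figures are hypothetical, chosen only to give x a value; the point is just that once utility(A) and utility(B) are equated by construction, the choice collapses to relative cost.

```python
# Minimal sketch of the decision rule above (illustrative numbers only).
# Assumption: a meal for the starving child (option A) costs $2, while the
# equally-useful gift for the healthy child (option B) costs $10.
MEAL_COST = 2.0   # cost of option A: one meal for the starving child
GIFT_COST = 10.0  # cost of option B: an equally valuable gift

# x is how many option-A meals the option-B budget could buy instead.
x = GIFT_COST / MEAL_COST

# utility(A) == utility(B) by construction, so the choice reduces to cost.
if x > 1:
    choice = "A"       # the B budget could feed x starving children, so do A
elif x < 1:
    choice = "B"       # B is actually cheaper for the same utility
else:
    choice = "either"  # equal cost and equal utility: indifferent

print(f"x = {x:.1f}, so choose {choice}")
```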