I wouldn’t allow a mob of 100,000 to kill another human being, no matter how much they wanted to, and even if their quality of life would be improved (up to a point).
Be careful about statements like this, due to scope insensitivity. Can you really grasp the collective mental effects of the desires of 100,000 people? Unless you have used some math in coming to this conclusion, your opinion is unlikely to be correlated with reality.
The ‘up to a point’ allows for potential consequentialist exceptions, for cases where there is sufficient instrumental value in allowing the killing despite the aversion. That does not lessen the validity of directly valuing the prevention of one murder over the ‘right to freedom’ of 100,000 people. For that conclusion, the only relevant evidence is, in fact, whatever information indicates what his preferences really are.
I don’t quite see what you mean. How would he know what his preferences are in this case without doing the math? Why is 100,000 the point at which the ‘right to freedom’ balances the taking of a life?
Ah, I now see why you are so adamant that his beliefs could not be related to reality. But note that you are assuming 100,000 is the turning point. It was not presented as a turning point, just as an example of a figure which would (usually) not qualify.
It takes only limited math to eyeball “much greater than 100,000, except under extreme extenuating circumstances”. (Trying to apply more specific numeric operations to the judgement is usually bogus: we don’t have enough information to reach a more specific conclusion.)
I also note that it would be legitimate to have preferences in which the “No Lynching!” rule isn’t subject even to roughly estimated math. Not everyone is a consequentialist (even if I would prefer that they were, the short-sighted, potentially-universe-sacrificing fiends!)
Even in a non-consequentialist morality, the way the number was presented implied that two things were being balanced in some way. It is extremely unlikely for a human to have a number like 100,000 simply built into their value system, even if that would be internally consistent.
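For illustration only (the thread itself does no such calculation, and every number below is made up): a naive additive model makes explicit the point that any “crossover” mob size falls out of the assumed weights, not out of the figure 100,000 itself.

```python
# Toy sketch of the kind of rough balancing math discussed above.
# The weights are entirely invented for illustration; nothing here
# is presented as anyone's actual value system.

def crossover_mob_size(value_of_a_life: float,
                       value_per_satisfied_desire: float) -> float:
    """Smallest mob whose aggregated desire would, on this naive
    additive model, outweigh one life."""
    return value_of_a_life / value_per_satisfied_desire

# Weight a life at 1,000,000 units and each person's satisfied
# desire at 1 unit: the crossover is a million people, not 100,000.
print(crossover_mob_size(1_000_000, 1))    # 1000000.0

# Halve the per-person weight and the crossover doubles, showing
# how sensitive the figure is to the assumptions.
print(crossover_mob_size(1_000_000, 0.5))  # 2000000.0
```

The point of the sketch is the sensitivity: any specific threshold like 100,000 encodes an assumed ratio of weights, which is why eyeballing an order of magnitude is about as far as the available information supports.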
I assume you oppose the death penalty, then?
I agree with all this.