I assert the use of 3^^^3 in a moral argument is to avoid the effort of multiplying.
Yes, that’s what I said. If the quantities were close enough to have to multiply, the case would be open for debate even to utilitarians.
Demonstration: what is 3^^^3 times 6?
3^^^3, or as close as makes no difference.
What is 3^^^3 times a trillion to the trillionth power?
3^^^3, or as close as makes no difference.
...that’s kinda the point.
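A quick sketch of why this holds, working in log10 magnitudes since 3^^^3 itself is far too large to represent (the stopping rung, 3^^5, is chosen purely for illustration; 3^^^3 is a tower roughly 7.6 trillion rungs tall):

```python
import math

# 3^^^3 = 3^^(3^^3): a power tower of 3s that is 3^^3 = 7,625,597,484,987
# levels tall. We can't compute it, so we climb only to the fifth rung (3^^5)
# and show that even there, multiplying by "a trillion to the trillionth
# power" is a rounding error.
log10_3 = math.log10(3)
tower3 = 3 ** 27                   # 3^^3 = 7_625_597_484_987
log10_tower4 = tower3 * log10_3    # log10(3^^4) ~ 3.6e12: 3^^4 has ~3.6 trillion digits

# log10(3^^5) = 3^^4 * log10(3) is itself a ~3.6-trillion-digit number,
# so we take its log10 as well:
log10_log10_tower5 = log10_tower4 + math.log10(log10_3)

# Multiplying 3^^5 by (10^12)^(10^12) adds 12 * 10^12 to log10(3^^5).
# How big is that addition, relative to log10(3^^5) itself?
log10_relative_shift = math.log10(12 * 10 ** 12) - log10_log10_tower5
print(f"relative shift ~ 10^({log10_relative_shift:.4g})")
# ~ 10^(-3.6e12): utterly negligible -- and 3^^^3 towers another
# 7.6 trillion rungs above 3^^5.
```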
So it seems you have two intuitions. One is that you like certain kinds of “feel good” feedback that aren’t necessarily mathematically proportional to the quantifiable consequences. Another is that you like mathematical proportionality.
Er, no. One intuition is that I like to save lives—in fact, as many lives as possible, as reflected by my always preferring a larger number of lives saved to a smaller number. The other “intuition” is actually a complex compound of intuitions, that is, a rational verbal judgment, which enables me to appreciate that any non-aggregative decision-making will fail to lead to the consequence of saving as many lives as possible given bounded resources to save them.
I’m feeling a bit of despair here… it seems that no matter how I explain that this is how you have to plan if you want the plans to work, people just hear, “You like neat mathematical symmetries.” Optimal plans are neat because optimality is governed by laws and the laws are math—it has nothing to do with liking neatness.
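A toy version of "this is how you have to plan if you want the plans to work," with invented numbers: given one fixed budget, a simple aggregative rule picks the plan that saves the most lives, while a rule that refuses to compare totals gives no verdict at all:

```python
# Invented numbers: one fixed budget, two programs competing for it.
budget = 10
programs = {
    "program A": (10, 500),  # (cost, lives saved)
    "program B": (10, 120),
}

# Aggregative rule: among affordable programs, fund the one that saves
# the most lives. A non-aggregative rule has no basis for choosing here.
affordable = [name for name, (cost, _) in programs.items() if cost <= budget]
best = max(affordable, key=lambda name: programs[name][1])
print(best)  # program A
```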
50 years of being tortured is not (50 years × 365 days × 24 hours × 3600 seconds)-times worse than 1 second of torture. It is much (non-linearly) worse than that.
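For scale, that linear multiplier works out to roughly 1.6 billion:

```python
# Seconds in 50 years, as spelled out in the comment above.
seconds = 50 * 365 * 24 * 3600
print(f"{seconds:,}")  # 1,576,800,000 -- about 1.6e9
```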
Utilitarianism does not assume that multiple experiences to the same person aggregate linearly.
Yes, I agree that it is non-linearly worse.
It is not infinitely worse. Just non-linearly worse.
The non-linearity factor is nowhere within a trillion to the trillionth power galaxies as large as 3^^^3.
If it were, no human being would ever think about anything except preventing torture or goals of similar importance. You would never take a single moment to think about putting an extra pinch of salt in your soup, if you felt a utility gradient that large. For that matter, your brain would have to be larger than the observable universe to feel a gradient that large.
I do not think people understand the largeness of the Large Number here.
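For anyone unfamiliar with the notation, the standard Knuth up-arrow expansion shows how fast it grows:

```python
# Knuth up-arrow notation, expanded step by step:
print(3 ** 3)        # 3^3  = 27
print(3 ** 3 ** 3)   # 3^^3 = 3^(3^3) = 3^27 = 7,625,597,484,987
# 3^^4 = 3^(3^^3) = 3^7625597484987: ~3.6 trillion digits, already unprintable.
# 3^^^3 = 3^^(3^^3): a power tower of 3s, 7,625,597,484,987 levels tall.
```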
You’ve left out that some people find utility in torture.
And also (I think this with somewhat less certainty) made a claim that people shouldn’t care about their own quality of life if they’re utilitarians.
Why the downvotes?
There really are people who find utility in torture, whether because it makes them feel better to do it, or because they believe that torture helps maintain a social order that they prefer to any alternatives they can imagine.
Well, I didn’t downvote, but I think your comment misses the point. The discussion is about the requirements of a sane metaethics, specifically that you can’t have both “mundane things which are still worth trying to achieve or avoid” and “sacred things that can never be compromised for the sake of mundane things”. You have to treat them both quantitatively, or else never make a decision on the basis of mundane things, or else be inconsistent; and so there really has to be a point where a larger probability (or preponderance) of a “mundane” consideration outweighs a tiny probability (or preponderance) of a “sacred” one.
The precise degree to which torture is awful is kind of irrelevant to this debate. Pick another example if this one doesn’t work for you.
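A toy expected-value version of the trade-off between "sacred" and "mundane" considerations (all weights and probabilities invented): as long as the sacred harm carries any finite weight, a small enough probability of it is outweighed by a near-certain mundane gain:

```python
# Invented weights: "sacred" harm huge but finite, "mundane" gain small.
SACRED_HARM = 1e12
MUNDANE_GAIN = 1.0

p_sacred = 1e-15   # tiny probability of triggering the sacred harm
p_mundane = 0.99   # near-certain probability of the mundane gain

expected_value = p_mundane * MUNDANE_GAIN - p_sacred * SACRED_HARM
print(expected_value > 0)  # True: the mundane consideration wins this trade
```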
I can’t work it out either.
You might be replying to someone else.
Copy and paste fail. “Why the downvotes?” is what I could not work out.