Heh, no, I’m not saying that if X holds then ~X fails to hold.
I had a feeling you weren’t. :)
I’m saying that we (those of us who chose dust specks) have chosen to reject utilitarianism and to propose an alternative, since we can’t merely choose nonapples over apples.
Yes, that’s accurate. If you take utilitarianism to its logical conclusion, you reach conclusions like choosing Torture in Torture vs. Dust Specks problems. This conversation reminds me a lot of the excellent book “The Limits of Morality.”
I’d be curious as to why anyone would choose to reject utilitarianism on the basis of this thought experiment, though.
Then it seems we’ve reached an agreement, as Aumann’s agreement theorem says we should. And yes, this is a thought experiment; it is unlikely that anyone will ever have to choose between such extremes (or that 3^^^3 people will ever exist, at once or even in total). However, whether real or not, if one rejects utilitarianism here, one can’t simply say, “Well, it works in all real scenarios, though.” Eliezer could just as easily have mentioned a utility monster, but he chose to convey the same thought experiment in a more original way.
Right. I’m just unclear as to why people (not you specifically, I just meant it generally in my previous comment) interpret these kinds of stories as criticisms of utilitarianism. They are simply taking the axioms to their logical extremes, not offering arguments against accepting those axioms in the first place.
Ah, well if that’s the point you’re making then yes, you’re indeed correct. Eliezer has by no means argued that utilitarianism is entirely wrong, just shown that its logical extreme is wrong (which may or may not have been his intention). If you’re arguing that others have seen this differently than we have, and have interpreted this article in a less rational way... well, you may also have a point there. It’s not particularly surprising, though, since there are dozens (perhaps hundreds) of ways to succumb to one or more fallacies and only one way to succumb to none.
First of all, I am for the torture, as are 22.1% of the people recently surveyed, vs. 36.8% who are for the dust specks; the rest declined to respond or are unsure.
Secondly, the issue of small dispersed disutilities vs. large concentrated ones is one we constantly encounter in the real world, and time after time society accepts that, for the sake of e.g. the convenience of driving, we can tolerate the unavoidable tradeoff of occasional traffic accidents. Nor do we sacrifice every tiny little luxury just to gather resources to save a single extra life. Conversely, if you had to break 7 billion legs to save a single man from being tortured, most people would not consider that tradeoff acceptable.
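The aggregation logic behind the torture choice can be sketched in a few lines. Every number below is invented purely for illustration (there is no agreed-on disutility scale), and the population is a stand-in vastly smaller than 3^^^3:

```python
# Toy utilitarian aggregation; all disutility values here are made up.
SPECK_DISUTILITY = 1e-9       # hypothetical harm of one dust speck
TORTURE_DISUTILITY = 1e9      # hypothetical harm of 50 years of torture
POPULATION = 10 ** 30         # stand-in; 3^^^3 is incomprehensibly larger

total_speck_harm = SPECK_DISUTILITY * POPULATION
print(total_speck_harm > TORTURE_DISUTILITY)  # True: aggregate specks dominate
```

The point is only that under any finite per-speck disutility, a sufficiently large population makes the summed speck harm exceed the torture harm; the specific constants don’t matter.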
Once this logic is in place, all that remains is scope insensitivity: people can’t really intuit the vast size of 3^^^3.
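For concreteness, 3^^^3 is Knuth’s up-arrow notation. A minimal sketch of the recursion (the function name is my own) shows how fast it explodes; only the tiniest inputs are computable:

```python
def knuth_up(a, n, b):
    """Compute a (up-arrow^n) b in Knuth's up-arrow notation (tiny inputs only)."""
    if n == 1:
        return a ** b            # one arrow is plain exponentiation
    if b == 0:
        return 1                 # recursion base case
    return knuth_up(a, n - 1, knuth_up(a, n, b - 1))

print(knuth_up(3, 1, 3))  # 3^3 = 27
print(knuth_up(3, 2, 3))  # 3^^3 = 3^(3^3) = 7625597484987
# 3^^^3 = 3^^(3^^3): a power tower of 7625597484987 threes -- hopeless to compute.
```

Even 3^^4, one step below 3^^^3’s building blocks, already has trillions of digits, which is the sense in which intuition simply cannot track the quantity.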