Most people choose the many dust specks over the torture. Some people argued that “human values” include having a utility aggregation function that rounds utilities of tiny absolute value to zero, thus giving the “dust specks” answer. No, Eliezer said; this was an error in human reasoning. Is it an error, or a value?
I’m not sure. I think the answer most people give on this has more to do with fairness than rounding to zero. Yeah, it’s annoying for me to get a dust speck in my eye, but it’s unfair that someone should be tortured for 50 years just to spare me (and 3^^^3 others) from dust specks. I would choose getting a dust speck in my eye over someone else being tortured, and I think most people are similar enough to me that I can assume the same of the other 3^^^3 people.
As Eliezer pointed out, if it’s fairness, then you probably have a curved but continuous utility function—and with the numbers involved, it has to be a curve specifically tailored to the example.
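To make the two aggregation rules above concrete, here is a minimal sketch with made-up stand-in numbers; N, SPECK, and TORTURE are values I invented for illustration, since 3^^^3 itself can’t actually be computed:

```python
# Minimal sketch of the two aggregation rules under discussion, with
# made-up stand-in numbers: 3^^^3 is far too large to represent, so N is
# just "an enormous number of speck victims".
N = 10**30          # stand-in for 3^^^3
SPECK = -1e-6       # assumed disutility of one dust speck
TORTURE = -1e9      # assumed disutility of 50 years of torture

# Plain (linear) utilitarian sum: the specks dominate.
specks_total = N * SPECK        # about -1e24
torture_total = TORTURE         # -1e9
print(specks_total < torture_total)    # True -> a linear aggregator picks torture

# "Round tiny utilities to zero" aggregator: each speck counts as exactly 0.
specks_rounded = N * 0.0
print(specks_rounded > torture_total)  # True -> the rounding aggregator picks specks
```

The question in the thread is whether anything less abrupt than that hard rounding step (a smooth, continuous curve) can still deliver the specks answer at 3^^^3 scale.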
Where did Eliezer talk about fairness? I can’t find it in the original two threads.
This comment talked about sublinear aggregation, but there’s a global variable (the temperature of the, um, globe). Swimmer963 is talking about personally choosing specks and then guessing that most people would behave the same. Total disutility is higher, but no one catches on fire.
If I were forced to choose between two possible events, and if killing people for organs had no unintended consequences, I’d go with the utilitarian option in both cases, with a side order of a severe permanent guilt complex.
On the other hand, if I were asked to accept the personal benefit, I would behave the same as Swimmer963, and with similar expectations about other people. Interestingly, if people are similar enough that TDT (timeless decision theory) applies, my personal decisions become normative. There’s no moral dilemma in the case of torture vs specks, though, since choosing torture would result in extreme psychological distress times 3^^^3.
> Where did Eliezer talk about fairness? I can’t find it in the original two threads.
When Eliezer wrote,
> While some people tried to appeal to non-linear aggregation, you would have to appeal to a non-linear aggregation which was non-linear enough to reduce 3^^^3 to a small constant. In other words it has to be effectively flat.
I am taking the inferential step that he was responding to everyone who appealed to non-linear aggregation, including those who just said “we value fairness” without saying or knowing that a technical way of saying that was “we compute a sum over all individuals i of f(utility(i)), where f is concave.”
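For concreteness, here is one way to write down that condition. The notation (N, T, ε, f) is mine, not Eliezer’s, and it assumes welfare is aggregated as a sum of f applied to each person’s utility, with f increasing, concave, and f(0) = 0:

```latex
% One formalization (my notation, not Eliezer's): N = 3^^^3 people,
% torture has utility -T, a speck has utility -eps, and welfare is
% W = \sum_i f(u_i) with f increasing, concave, and f(0) = 0.
\begin{align*}
W_{\text{torture}} &= f(-T), \qquad W_{\text{specks}} = N\, f(-\varepsilon),\\
\text{specks preferred} &\iff N\, f(-\varepsilon) > f(-T)
  \iff \lvert f(-\varepsilon)\rvert < \frac{\lvert f(-T)\rvert}{N}.
\end{align*}
% With N = 3^^^3, the bound |f(-T)|/N is essentially zero for any remotely
% bounded f(-T), so f has to be "effectively flat" at the speck scale;
% how flat depends on the particular eps and T, which is the sense in
% which the curve is tailored to the example.
```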
> and with the numbers involved, it has to be a curve specifically tailored to the example.
A pure utilitarian who could grasp a number as large as 3^^^3 might choose the one person being tortured. My point was that intuitively, the unfairness of torture jumps out more than the huge, huge number of people being minorly annoyed.
Maybe fairness as an intuition is more a flaw than a value. That’s actually an interesting thought. I’m going to ponder that now for a while.
My own feeling is that fairness as an intuition is very useful in small groups but starts to break down as the group gets larger. Which is what I would expect for an intuition that evolved in the context of small groups.
This matches my intuition on the subject. It also matches my intuition about Nozick’s Utility Monster. Yes, total utility will be maximized if we let the monster eat everyone, but it introduces a large disparity: huge utility for the monster, huge disutility for everyone else.
The question is, is this a “valid value” or a problem? The only way I can see to answer this is to ask if I would self-modify to not caring about fairness, and I don’t know the answer.
But remove human agency and imagine the torturer isn’t a person. Say you can remove a dust speck from your eye, but the procedure has a 1/3^^^3 chance of failing and giving you injuries equivalent to being tortured for 50 years.
Now imagine 3^^^3 people make the same choice. One of them will likely fail the procedure and get tortured.
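And a quick check of that last step, assuming the 3^^^3 procedures are independent and each fails with probability 1/3^^^3:

```latex
% Expected number of tortured people and the probability that at least one
% procedure fails, assuming independence (n = 3^^^3, p = 1/3^^^3).
\begin{align*}
\mathbb{E}[\text{failures}] &= n p = 3\uparrow\uparrow\uparrow 3 \cdot \frac{1}{3\uparrow\uparrow\uparrow 3} = 1,\\
\Pr(\text{at least one failure}) &= 1 - (1 - p)^{n} \approx 1 - e^{-1} \approx 0.63.
\end{align*}
% So in expectation exactly one person ends up tortured, and the chance that
% at least one does is about 63%: "likely", though not certain.
```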