As Eliezer pointed out, if it’s fairness, then you probably have a curved but continuous utility function—and with the numbers involved, it has to be a curve specifically tailored to the example.
Where did Eliezer talk about fairness? I can’t find it in the original two threads.
This comment talked about sublinear aggregation, but there’s a global variable (the temperature of the, um, globe). Swimmer963 is talking about personally choosing specks and then guessing that most people would behave the same. Total disutility is higher, but no one catches on fire.
If I were forced to choose between two possible events, and if killing people for organs had no unintended consequences, I'd go with the utilitarian option in both cases, with a side order of a severe permanent guilt complex.
On the other hand, if I were asked to accept the personal benefit, I would behave the same as Swimmer963 and with similar expectations. Interestingly, if people are similar enough that TDT applies, my personal decisions become normative. There’s no moral dilemma in the case of torture vs specks, though, since choosing torture would result in extreme psychological distress times 3^^^3.
Where did Eliezer talk about fairness? I can’t find it in the original two threads.
When Eliezer wrote,
While some people tried to appeal to non-linear aggregation, you would have to appeal to a non-linear aggregation which was non-linear enough to reduce 3^^^3 to a small constant. In other words it has to be effectively flat.
I am taking the inferential step that he was responding to everyone who appealed to non-linear aggregation, including those who just said "we value fairness" without saying or knowing that a technical way of saying that was "we compute a sum over all individuals i of f(utility(i)), where f is concave."
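To make the aggregation point concrete, here is a toy sketch of sum-of-f aggregation. All the specific numbers are assumptions invented for illustration: torture is one person at utility −1000, a speck is utility −0.001, and N = 10^9 stands in for 3^^^3, which no computer can represent.

```python
def aggregate(utilities, f):
    # "Sum over all individuals i of f(utility(i))."
    return sum(f(u) for u in utilities)

def linear(u):
    return u

def concave(u):
    # Concave on losses: one big harm is amplified relative to
    # many tiny ones (a crude stand-in for a fairness term).
    return -u * u if u < 0 else u

N = 10**9  # stand-in for 3^^^3
torture_world = [-1000.0]
specks_total_linear = N * linear(-0.001)    # -1e6
specks_total_concave = N * concave(-0.001)  # about -1e3

# Linear aggregation prefers torture (smaller total harm):
assert aggregate(torture_world, linear) > specks_total_linear
# Concave aggregation flips the verdict -- for this N:
assert aggregate(torture_world, concave) < specks_total_concave
```

This is exactly Eliezer's point about the required non-linearity: the concave f above flips the answer at N = 10^9, but for N = 3^^^3 the per-speck term would have to shrink below 10^6 / 3^^^3, i.e. f would have to be effectively flat near zero.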
and with the numbers involved, it has to be a curve specifically tailored to the example.
A pure utilitarian who could grasp a number as large as 3^^^3 might choose the one person being tortured. My point was that intuitively, the unfairness of torture jumps out more than the huge, huge number of people being mildly annoyed.
Maybe fairness as an intuition is more a flaw than a value. That’s actually an interesting thought. I’m going to ponder that now for a while.
My own feeling is that fairness as an intuition is very useful in small groups but starts to break down as the group gets larger. Which is what I would expect for an intuition that evolved in the context of small groups.