The world is made of atoms and such. Quarks. Whatever the lowest level is doesn't matter. It is not made of people, not of happiness, and not of dust specks. The hypothetical utility function that corresponds to people's well-being has to take as its input the contents of a region of space (including the laws of physics), process that, and somehow identify the happiness and suffering occurring anywhere in that region.
This hypothetical function does not have the property that the well-being of a region of space equals the sum of the well-being of its parts considered individually, for obvious reasons: this won't work for very tiny parts, and you value your head as it is more than you value your head diced into 27 cubes and rearranged randomly like a Rubik's cube, yet you can't tell the difference if you consider those cubes individually.
The dust specks vs. torture reasoning confuses f(dustspeck, dustspeck, dustspeck, … n times) with n·f(dustspeck), which would only be valid if the above-mentioned additivity property held.
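The distinction can be sketched in a few lines of Python. The numbers and the particular sublinear aggregator here are arbitrary toy choices, not anything from the original argument; the point is only that f(speck, …, speck) and n·f(speck) coincide for an additive aggregator and come apart for a non-additive one.

```python
import math

# Hypothetical toy disutility values, for illustration only.
DUSTSPECK = 1
TORTURE = 10**9

def additive(events):
    # Whole = sum of parts: the property the argument rejects.
    return sum(events)

def nonadditive(events):
    # One arbitrary sublinear aggregator: the disutility of many
    # separate mild events grows slower than their count.
    return math.sqrt(sum(events))

n = 1_000_000  # small stand-in for the thought experiment's huge n
specks = [DUSTSPECK] * n

# Only the additive aggregator satisfies f(speck, ..., speck) == n * f(speck):
print(additive(specks) == n * additive([DUSTSPECK]))        # True
print(nonadditive(specks) == n * nonadditive([DUSTSPECK]))  # False
```

Under the sublinear aggregator, a million specks score sqrt(10**6) = 1000 while the torture term scores far more, so the "just multiply out the specks" step no longer goes through.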
There's nothing inconsistent about a utility function that returns the worst suffering it can find inside the region of space; it merely doesn't approximate human morality very well, and neither does a utility function that merely sums the well-being of people.
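As a minimal sketch of that worst-suffering aggregator (a hypothetical function written for this illustration, using the same toy disutility numbers as above):

```python
def worst_suffering(events):
    """A coherent but non-additive aggregator: the disutility of a
    region is the single worst suffering found in it."""
    return max(events, default=0)

# Under this aggregator, no number of dust specks (disutility 1 each)
# ever outweighs one instance of torture (disutility 10**9):
print(worst_suffering([1] * 1_000_000))           # 1
print(worst_suffering([1] * 1_000_000 + [10**9])) # 1000000000
```

Nothing about this function is internally inconsistent; it simply encodes different trade-offs than a straight sum does.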