Average utilitarianism is non-local
This is a fairly obvious point, one that is sort of made here, but I haven’t seen anyone lay it out explicitly.
Under average utilitarianism, the morality of having a child depends on whether there are any sentient aliens a billion light years away, how many there are, and whether their average happiness is greater or less than your child’s would be. This despite your actions having zero impact on them, and vice versa.
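To make the arithmetic explicit (a small worked example; the symbols are mine, not taken from any particular formalisation): suppose the rest of the universe contains $N$ sentient beings with average welfare $\bar{u}$, and your prospective child would have welfare $w$. Adding the child shifts the universal average from $\bar{u}$ to

$$\frac{N\bar{u} + w}{N + 1},$$

which exceeds $\bar{u}$ exactly when $w > \bar{u}$. So whether the birth raises or lowers average utility turns entirely on how $w$ compares to a cosmic average you cannot observe, and on how large $N$ is, even though neither quantity is causally connected to your decision.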
This, far more than the sadistic conclusion, seems to me to sound the death knell of average utilitarianism. If your formalisation of utilitarianism both completely diverges from intuition in a straightforward situation (should I have kids?) and is incalculable not just in practice but even in theory (it requires knowing the contents of the whole universe), what’s the point?