I think the apparent contradiction can be resolved at a fairly well-defined point between the two extremes. The argument goes as follows:
Most humans don’t value all lives the same. The ancestral environment shaped us to value people near us more than distant people. Anyone beyond the closest ~150 (Dunbar’s number) receives significantly less attention, and in some hunter-gatherer communities the valuation of outsiders may actually be zero, or possibly even negative. Modern society has allowed us to apply ethics and empathize with all beings, which flattens out the valuation of human lives. But only in the idealized case are all beings valued equally. Humans intuitively don’t value them that way.
So when you add people and redistribute welfare, most people, even those who consciously accept equal valuation, are caught by a conclusion repugnant to their intuition: they don’t really value all those many beings equally after all. If you instead apply some gradual valuation function, the math works out a bit differently, as each add-and-redistribute step adds a little less value, because the added being is more distant from you, even if only by a very small amount.
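A small sketch may make the arithmetic concrete. The geometric discount factor, the specific numbers, and the function names below are all illustrative assumptions of mine, not part of the original argument; the point is only that a weight that shrinks with "distance" bounds the total, so a huge population of barely-worth-living lives need not outweigh a small flourishing one.

```python
# Illustrative sketch (hypothetical names and numbers): compare total value
# under equal valuation vs. a distance-discounted valuation.

def total_value_equal(n, welfare_per_person):
    """Classical total view: every life counts the same."""
    return n * welfare_per_person

def total_value_discounted(n, welfare_per_person, r=0.999):
    """The i-th added (more 'distant') person is weighted by r**i, with r < 1,
    so each add-and-redistribute step contributes a little less value."""
    return sum(welfare_per_person * r**i for i in range(n))

# Equal valuation: a vast population of barely-positive lives dominates
# a small, flourishing one (the repugnant conclusion).
assert total_value_equal(2_000_000, 0.01) > total_value_equal(150, 100)

# Discounted valuation: the weights sum to at most 1/(1 - r), so the value of
# the vast population is bounded (here by 0.01 / 0.001 = 10) and the small,
# flourishing population wins, matching the intuition described above.
assert total_value_discounted(2_000_000, 0.01) < total_value_discounted(150, 100)
```

With equal weights, total value grows without bound as people are added; with any discount r < 1, the sum converges, which is why the transitional model escapes the repugnant conclusion.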
I think this model is more honest to what humans really feel. Society’s stated opinion may differ, though.