(Just FYI, over the course of this discussion I have been gradually updating downward my confidence that you’re interested in being accurate and fair about total utilitarians, rather than merely slinging mud.)
I admit I have been using deliberately emotive descriptions, as I believe that total utilitarians have gradually disconnected themselves from the true consequences of their beliefs—the equivalent of those who argue that “maybe the world isn’t worth saving” while never dreaming of letting people they know or even random strangers just die in front of them.
You also have to consider their impact on others, and the impact on the whole society of all that killing-and-replacing.
Of course! But a true total utilitarian would therefore want to mould society (if they could) so that killing-and-replacing has less negative impact.
The scenario I suppose you need to imagine here is that we have machines for manufacturing fully-grown people, and they’ve gradually been getting better so that they produce better and happier and nicer and more productive people.
In a future where uploads and copying may be possible, this may not be as far-fetched as it seems (and total resources are likely to be limited). That's the only reason I care about this: there could be situations created in the medium-term future where the problematic aspects of total utilitarianism come to the fore. I'm not sure we can over-rely on practical considerations to keep these conclusions at bay.