What of exponential total utilitarianism? That’s a variant of total utilitarianism that multiplies the total utility by the exponential of the population. It may be assigned a very low credence, but as the population grows, it will eventually come to dominate the expected-value calculation.
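To make the worry concrete, here’s a quick Python sketch (all credences and utilities invented) of an unnormalised mixture that assigns a tiny credence to exponential total utilitarianism:

```python
import math

# Invented credences: exponential total utilitarianism is deemed very unlikely.
credence_exp = 1e-9
credence_rival = 1 - credence_exp  # some rival theory, say average utilitarianism

avg_utility_per_person = 1.0

for population in [10, 20, 30, 50]:
    total_utility = population * avg_utility_per_person
    # Exponential total utilitarianism: total utility times exp(population).
    exp_term = credence_exp * total_utility * math.exp(population)
    rival_term = credence_rival * avg_utility_per_person
    print(f"pop={population}: exp term={exp_term:.3g}, rival term={rival_term:.3g}")

# By population 20 the 1e-9-credence exponential term (~9.7) already exceeds
# the rival term (~1), and by population 30 it is ~3e5.
```

However small you make `credence_exp`, some modest population size hands the whole mixture to the exponential theory.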
That’s why I think moral theories should be normalised independently, to prevent the super-population ones from winning just by default.
Therefore, if total utilitarianism is not heavily weighted, it will likely remain unimportant. Your phrasing “or someone whose moral uncertainty includes total utilitarianism” suggested to me that you thought total utilitarianism would be important even if assigned a low weight, which in turn suggested that it was not being normalised.
I’m assuming independent normalisation as well. Did I give a different impression in the post? If so I’ll try to clarify.
Normally when I normalise, I use the expected maximum of the utility function, i.e. the expected value it would attain if we just maximised it and nothing else: https://www.lesswrong.com/posts/hBJCMWELaW6MxinYW/intertheoretic-utility-comparison
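As a rough illustration (a toy sketch with invented utility functions, actions, and credences, not the exact scheme from the linked post; outcomes are deterministic here, so the expected maximum is just a maximum):

```python
import math

# Two toy theories over outcomes described by a population size.
def u_exp_total(population):
    # Exponential total utilitarianism: total utility times exp(population).
    return population * math.exp(population)

def u_average(population):
    # Toy average utilitarianism: fixed resources shared across the population.
    return 100.0 / population

theories = {"exp-total": u_exp_total, "average": u_average}

# Hypothetical actions, each leading deterministically to a population size.
actions = {"small": 10, "medium": 50, "large": 100}

# Normalise each theory by the value it would attain if we maximised it
# and nothing else.
norm = {name: max(u(pop) for pop in actions.values())
        for name, u in theories.items()}

credences = {"exp-total": 0.01, "average": 0.99}

for act, pop in actions.items():
    score = sum(credences[name] * theories[name](pop) / norm[name]
                for name in theories)
    print(f"{act}: {score:.3f}")

# small: ~0.990, medium: ~0.198, large: ~0.109. After normalisation the
# 1%-credence exponential theory can contribute at most ~0.01, so it no
# longer decides the outcome by default.
```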
Ok, I didn’t mean that. What I meant was that if your moral uncertainty includes total utilitarianism, then the total utilitarian part should reason as follows. Would it be clearer / clear enough if I replaced “or someone whose moral uncertainty includes total utilitarianism” with “or the total utilitarianism part of someone’s moral uncertainty”?
I think that would be clearer, yes.
Thanks, I’ve made that edit.