You seem to assume we should endorse something like average utilitarianism. Bostrom and I consider total utilitarianism to be closer to the best moral framework. See Parfit’s writings if you want deep discussion of this topic.
Thanks! Just read some summaries of Parfit. Do you know of any literature that addresses this issue in the context of a) impacts on other species, or b) using artificial minds as the additional population? I assume total utilitarianism presupposes arbitrarily growing physical space for populations to expand into and would not apply to finite spaces or resources (I think I recall Bostrom addressing that).
Reading up on Parfit also made me realize that Deep Utopia really has prerequisites, and you were right that it's probably more readily understood by those with a philosophy background. I didn't really understand what Bostrom was saying about utilitarianism until just now, after reading about Parfit.