I don’t care about total utility, so to me arbitrarily many copies of myself with worse lives are strictly worse than a single copy with a better life. The subjective experience of each copy will be that it traded a better life for a worse one, and each copy will be identical. I do not accept the offer. I think the morality of accepting it varies from person to person.
On the other hand, I think a lot of people would take this offer if they themselves were paid handsomely and did not have to become serfs, but their copies did.
It’s not clear what this means, and under any reasonable interpretation there seems to be no way for you to know the truth or falsity of this statement with significant certainty.
(Unless you mean that your emotional response or cached opinion points this way, which answers the original question to some extent; but in that case the specific phrase “I don’t care about total utility” seems to be posing as an additional argument that justifies the emotion/opinion, which it doesn’t appear to be.)
It’s an emotional claim, but not an unconsidered one.
But what I mean is that I do not see adding entities that only slightly prefer being alive to dying as worth doing. I don’t think the total amount of utility in existence is what matters; I value utility for entities that already exist. I would prefer a world of 10 thousand very happy people to one of 10 billion slightly happy people.