Saying the utility can’t be that enormous does not rule out any objective facts: it just says I don’t care that much.
To say you don’t care that much is a claim of objective fact. People sometimes discover that they do very much care (or, if you like, change so that they come to care very much) about something they did not before. For example, conversion to ethical veganism. You may say that you will never entertain enormous utility, and this claim may be true, but it is still an objective claim.
And how do you even know? No-one can exhibit their utility function, supposing they have one, nor can they choose it.
As I said, I concede that I would pay $100 for that probability of that result, if I cared enough about that result, but my best estimate of how much I care about that probability of that result is “too little to consider.” And I think that is currently the same for every other human being.
(Also, you consistently seem to be implying that “entertaining enormous utility” is something different from being willing to pay a meaningful price for small probability of something: but these are simply identical—asking whether I might objectively accept an enormous utility assignment is just the same thing as asking whether there might be some principles which would cause me to pay the price for the small probability.)
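The claimed identity can be sketched numerically. Under a naive expected-utility rule (my assumption for illustration, not anything established in the thread), accepting a utility assignment U for an outcome of probability p just is being willing to pay any price up to p·U for it, so "entertaining enormous utility" and "paying a meaningful price for a small probability" are two descriptions of one decision:

```python
def willing_to_pay(price, p, utility):
    """Naive expected-utility test: pay iff p * utility >= price.

    Illustrative assumption: the agent maximizes expected utility
    and money maps linearly onto utility.
    """
    return p * utility >= price

# A tiny probability with ordinary stakes: not worth $100.
print(willing_to_pay(100, 1e-9, 1_000))   # False

# The same probability with an "enormous" utility assignment:
# willingness to pay follows automatically.
print(willing_to_pay(100, 1e-9, 1e12))    # True

# Entertaining any U >= price / p is exactly the same fact
# as being willing to pay the price for the small probability.
threshold = 100 / 1e-9
print(threshold)  # 1e+11
```

The point of the sketch is that the threshold utility price / p is defined by the very same inequality that defines willingness to pay, so there is no separate question to ask.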