Yes, this is the refutation of Pascal’s mugger that I believe in, although I never got around to writing it up as you did. However, I disagree that it implies our utilities must be bounded. All the argument shows is that ordinary people never assign enormous utility values to events without also assigning them commensurately low probabilities. That is, normative claims (i.e., claims that certain events have certain utilities assigned to them) are judged fundamentally differently from factual claims, and require more evidence than merely the complexity prior. In a moral intuitionist framework, this corresponds to the fact that anyone can say that 3^^^3 lives are suffering, but it would take living 3^^^3 years and getting to know 3^^^3 people personally to feel the 3^^^3-times-greater utility associated with that event.
I don’t know how to distinguish the scenario where our utilities are bounded from the one where our utilities are unbounded but regularized (or whether our utilities are sufficiently well-defined to distinguish the two). Still, I want to emphasize that the latter situation is possible.
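To illustrate what "unbounded but regularized" could mean (this toy model is mine, not from the comment above): an unbounded utility function can still yield a finite expectation if the prior decays faster than the utility grows. Below, u(n) = 3^n is unbounded; under a prior decaying like 4^(-n) the expectation converges, while under one decaying like 2^(-n) it diverges.

```python
# Toy sketch (assumed for illustration): utility u(n) = 3**n over outcomes
# n = 1, 2, 3, ..., weighted by an unnormalized prior p(n) = decay**(-n).
# Normalization constants don't affect convergence, so they are omitted.

def partial_expectation(decay, terms):
    """Partial sum of u(n) * p(n) for n = 1..terms."""
    return sum(3**n * decay**(-n) for n in range(1, terms + 1))

# Prior decays faster than utility grows: partial sums approach the
# geometric limit sum_{n>=1} (3/4)**n = 3, despite u being unbounded.
regularized = [partial_expectation(4, t) for t in (10, 20, 40)]

# Prior decays too slowly: partial sums of (3/2)**n grow without bound.
unregularized = [partial_expectation(2, t) for t in (10, 20, 40)]

print(regularized)    # converges toward 3
print(unregularized)  # blows up
```

The point of the sketch is that boundedness of the utility function and convergence of expected utility are separate properties: the latter can be secured by the prior alone.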
All the argument shows is that ordinary people never assign enormous utility values to events without also assigning them commensurately low probabilities.
Under certain assumptions (which perhaps could be quibbled with), demanding that utility values can’t grow too fast for expected utility to be defined actually does imply that the utility function must be bounded.
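One standard sketch of the construction behind that claim (assuming the agent's expected utility must be defined for every countable lottery over outcomes): if $u$ is unbounded, pick outcomes $x_1, x_2, \dots$ with $u(x_n) \ge 2^n$, and consider the St. Petersburg-style lottery assigning probability $2^{-n}$ to $x_n$. Then

$$
\mathbb{E}[u] \;=\; \sum_{n=1}^{\infty} 2^{-n}\, u(x_n) \;\ge\; \sum_{n=1}^{\infty} 2^{-n} \cdot 2^n \;=\; \sum_{n=1}^{\infty} 1 \;=\; \infty,
$$

so the expectation is undefined (infinite). Contrapositively, requiring finite expected utility on all such lotteries forces $u$ to be bounded; the quibble room lies in whether preferences really must be defined over all countable lotteries.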
I’m not sure what you mean by that.