Yes. See: Flaws. This is Pascal’s Mugging; it shows up in real-valued utility systems too. You need a slightly more unlikely set-up, but it’s still a plausible scenario, so it’s not a problem that the real utility system avoids.
Well, the usual utilitarian answer does not have this particular problem; it trades it for the repugnant conclusion “torture wins”.
Anyway, I don’t see how your approach avoids any of the standard pitfalls of utilitarianism, though it might be masking some.
Surreal Utilities can support that conclusion as well: how you decide on Torture v. Dust Specks depends entirely on your choice of tiers.
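To make the tier-dependence concrete, here is a toy sketch. It models tiers as lexicographic tuples compared element-by-element, which is an assumption standing in for the actual surreal machinery; the tier assignments for torture and specks are likewise hypothetical:

```python
# Toy two-tier utilities (higher tier first, compared lexicographically),
# a stand-in for surreal tiers. Whether enough dust specks can outweigh
# torture depends entirely on which tier specks are assigned to.

TORTURE = (-1, 0)          # torture: one unit of top-tier disutility
SPECK_LOW = (0, -1)        # a speck on a strictly lower tier than torture
SPECK_SAME = (-1e-12, 0)   # a speck on the same tier, just much smaller

def total(u, n):
    """Total utility of n independent copies of outcome u, per tier."""
    return (u[0] * n, u[1] * n)

huge = 10**30  # an enormous number of specks

# Lower-tier specks never add up to outweigh torture, no matter how many:
assert total(SPECK_LOW, huge) > TORTURE
# Same-tier specks eventually do:
assert total(SPECK_SAME, huge) < TORTURE
```

So the framework itself doesn’t settle Torture v. Dust Specks; the choice of tiers does.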
I’m talking purely about Pascal’s Mugging, where someone shows up and says “I’ll save 3^^^3 lives if you give me five dollars.” This is isomorphic to the corresponding problem on the surreals, where someone says “I’ll give you omega-utility (save a life) at a probability of one in one quadrillion.”
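The isomorphism can be sketched with the same lexicographic-tuple toy model (an assumed stand-in for surreal tiers, not the actual construction): a finite probability times a higher-tier payoff stays higher-tier, so the mugging dominates any sure lower-tier cost:

```python
from fractions import Fraction

# Two-tier lexicographic utility as a toy stand-in for surreal tiers:
# a value is (lives_tier, money_tier); tuples compare lexicographically,
# so any positive amount in the lives tier outweighs any money amount.

def expected_utility(lottery):
    """lottery: list of (probability, (lives, money)) pairs.
    Expectation is taken tier-by-tier, mirroring how a finite
    probability times an omega-sized utility stays omega-sized."""
    lives = sum(p * u[0] for p, u in lottery)
    money = sum(p * u[1] for p, u in lottery)
    return (lives, money)

p = Fraction(1, 10**15)  # one in one quadrillion

# Pay the mugger: lose $5, with a tiny chance of saving a life.
pay = [(p, (1, -5)), (1 - p, (0, -5))]
# Refuse: keep your five dollars.
refuse = [(Fraction(1), (0, 0))]

# The lives tier is compared first, so paying dominates
# no matter how small p is made.
assert expected_utility(pay) > expected_utility(refuse)
```

However small the probability, the top-tier term never washes out against the lower tier, which is exactly the mugging.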