The first is entirely up to you. The second are worth 0.0001ω, 0.01ω, and 0.99ω, respectively, and are still larger than any secular value. This is working as planned, as far as I’m concerned...
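To make that arithmetic concrete, here is a minimal sketch (my own illustration, not code from the post) of a two-tier utility compared lexicographically; the class name, fields, and numbers are all assumptions for the example, but they show why any positive ω-coefficient outweighs any purely secular amount:

```python
from functools import total_ordering

# Illustrative two-tier utility: 'omega' is the life-tier coefficient,
# 'secular' is ordinary real-valued utility. Comparison is lexicographic,
# so any positive omega-coefficient beats every purely secular value.
@total_ordering
class TieredUtility:
    def __init__(self, omega=0.0, secular=0.0):
        self.omega = omega      # coefficient on the omega tier
        self.secular = secular  # ordinary real-valued utility

    def __eq__(self, other):
        return (self.omega, self.secular) == (other.omega, other.secular)

    def __lt__(self, other):
        return (self.omega, self.secular) < (other.omega, other.secular)

    def scale(self, p):
        """Expected value at probability p: both tiers scale linearly."""
        return TieredUtility(self.omega * p, self.secular * p)

# Saving a life at 0.0001 probability still beats any secular jackpot.
life = TieredUtility(omega=1.0)
jackpot = TieredUtility(secular=10**12)
assert life.scale(0.0001) > jackpot
```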
Are you saying that any odds of your request causing Frank’s death, no matter how small, are unacceptable? Then you will never be able to ask for anything.
Yes. See: Flaws. This is Pascal’s Mugging; it shows up in real-valued systems too; you need a slightly more unlikely set-up, but it’s still a plausible scenario. It’s not a problem that the real-valued utility system avoids.
Well, the usual utilitarian answer does not have this particular problem; it trades it for the repugnant conclusion that torture wins.
Anyway, I don’t see how your approach avoids any of the standard pitfalls of utilitarianism, though it might be masking some.
Surreal Utilities can support that conclusion as well: how you decide on Torture v. Dust Specks depends entirely on your choice of tiers.
I’m talking purely about Pascal’s Mugging, where someone shows up and says “I’ll save 3^^^3 lives if you give me five dollars.” This is isomorphic to the surreal version of the problem, where someone says “I’ll give you omega-utility (save a life) at a probability of one in one quadrillion.”
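Spelled out with illustrative numbers (not taken from the discussion), the surreal mugger’s offer compares as

$$10^{-15}\cdot\omega \;>\; u(\$5)\quad\text{for every real-valued } u(\$5),$$

since $\varepsilon\omega$ exceeds every real number whenever $\varepsilon > 0$ is real; so the surreal-utility agent pays, just as the expected-value maximizer pays for the 3^^^3 lives.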