The tiered values approach appears to run into continuity troubles, even with surreal numbers.
Seelie 2.0 double-checks with its mental copy of your values, finds that you would rather have Frank’s life than infinite Fun, and assigns Frank’s life a tier somewhere in between; for simplicity, let’s say that it puts it in the tier. And having done so, it correctly refuses Omega’s offer.
How does it compare punching/severely injuring/torturing Frank with your pile of cushions or with infinite fun? What if there is a .0001%/1%/99% probability that Frank will die?
The first is entirely up to you. The second set are worth 0.0001ω, 0.01ω, and 0.99ω respectively, and each is still larger than any secular value. This is working as planned, as far as I’m concerned...
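To make the tier arithmetic concrete, here is a minimal sketch in Python of a toy two-tier model. The pair encoding and the names (`expected`, `franks_life`, `pile_of_cushions`) are my own illustration, not anything from the post, and it only captures the lexicographic comparison the reply above relies on, not surreal arithmetic in general.

```python
# Toy two-tier model: a value is a pair (life, secular), read as life*omega + secular.
# Python tuples compare lexicographically, so any nonzero amount in the life tier
# outranks any purely secular amount, however large.

def expected(u, p):
    """Scale both tier coefficients of a (life, secular) pair by the probability p."""
    life, secular = u
    return (life * p, secular * p)

franks_life = (1.0, 0.0)            # one omega of value, nothing secular
pile_of_cushions = (0.0, 10 ** 6)   # a large but purely secular value

for p in (0.0001, 0.01, 0.99):
    # These come out to 0.0001*omega, 0.01*omega, and 0.99*omega respectively,
    # and each still outranks the cushions.
    print(p, pile_of_cushions < expected(franks_life, p))   # True every time
```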
Are you saying that any odds of your request causing Frank’s death, no matter how small, are unacceptable? Then you will never be able to ask for anything.
Yes. See: Flaws. This is Pascal’s Mugging; it shows up in real-valued utility systems too (you need a slightly more unlikely set-up, but it’s still a plausible scenario), so it’s not a problem that the real utility system is free of.
Well, the usual utilitarian “torture wins” does not have this particular problem; it just trades it for the repugnant conclusion that torture wins.
Anyway, I don’t see how your approach avoids any of the standard pitfalls of utilitarianism, though it might be masking some.
Surreal Utilities can support that conclusion as well: how you decide on Torture v. Dust Specks depends entirely on your choice of tiers.
I’m talking purely about Pascal’s Mugging, where someone shows up and says “I’ll save 3^^^3 lives if you give me five dollars.” This is isomorphic to the surreal version of the problem, where someone says “I’ll give you omega-utility (save a life) at a probability of one in one quadrillion.”
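For concreteness, the same toy two-tier sketch from above (again my own illustration with hypothetical names, not the post’s machinery, and not a real surreal-number implementation) shows both points: the surreal mugging goes through at any nonzero probability, and Torture v. Dust Specks flips depending on which tier the specks are assigned to.

```python
# Values are (life, secular) pairs compared lexicographically, as before.

def expected(u, p):
    """Scale both tier coefficients of a (life, secular) pair by the probability p."""
    life, secular = u
    return (life * p, secular * p)

# Pascal's Mugging, surreal version: omega-utility (a saved life) at
# one-in-a-quadrillion odds still outranks any secular price.
mugging_offer = expected((1.0, 0.0), 1e-15)
five_dollars = (0.0, 5.0)
print(five_dollars < mugging_offer)   # True: the offer dominates

# Torture v. Dust Specks turns entirely on which tier the specks occupy.
HUGE = 3.0 ** 64                  # stand-in for 3^^^3, which no float can hold
torture = (1.0, 0.0)              # disutility placed in the omega tier
specks_same_tier = (HUGE, 0.0)    # specks placed in the same tier as the torture
specks_lower_tier = (0.0, HUGE)   # specks kept in the secular tier
print(torture < specks_same_tier)    # True: torture is the lesser harm ("torture wins")
print(specks_lower_tier < torture)   # True: no number of secular-tier specks reaches the torture
```

Which way the dust-speck comparison goes is decided entirely by that tier assignment, which is the point about the choice of tiers above.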