I agree with what you’re saying; the reason I used trillions was precisely that it’s an amount nobody has. Any being that can produce a trillion dollars on the spot is likely (more than 50%, is my guess) powerful enough to produce two trillion dollars, while the same cannot be said for billions.
As for expected utility vs expected payoff, I agree that under conditions of diminishing marginal utility the offer is almost never worth taking. I am perhaps a bit too used to the more absurd versions of Pascal’s Mugging, where the mugger promises to grant you utility directly, or disutility in the form of a quadrillion years of torture.
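A quick numerical sketch of why diminishing marginal utility defuses the monetary version. All the numbers here are assumptions chosen for illustration: log utility of wealth, a $50,000 bankroll, a $5 asking price, and a one-in-a-billion credence that the mugger pays out.

```python
import math

# Assumed diminishing marginal utility: log of total wealth (illustrative only).
def utility(wealth):
    return math.log(wealth)

wealth = 50_000   # hypothetical current wealth in dollars
cost = 5          # the mugger asks for $5
p = 1e-9          # assumed probability the mugger actually pays out

for promised in (1e9, 1e12, 1e15):
    # Expected utility of paying: with probability p you end up with
    # (wealth - cost + promised); otherwise just (wealth - cost).
    eu_pay = p * utility(wealth - cost + promised) + (1 - p) * utility(wealth - cost)
    eu_refuse = utility(wealth)
    print(f"promised ${promised:.0e}: gain from paying = {eu_pay - eu_refuse:.6f}")
```

The gain stays negative for every promise: under log utility, a thousandfold-larger promise adds only about log(1000) ≈ 6.9 utils, so no escalation of the reward can overcome the certain small loss at these odds.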
Probably the intuition against accepting the money offer does indeed lie in diminishing marginal utility, but I find it interesting that I’m not tempted to take the offer even if it’s stated in terms of things with constant marginal utility to me, like lives saved or years of torture prevented.
My instant response is that this strongly suggests lives saved and years of torture prevented do not in fact have constant marginal utility to you, or, more specifically, to the part of you that controls your intuitive reactions. I share your lack of temptation to take the offer.
I can offer two explanations, either or both of which may hold:
My instinctive sense of “altruistic temptation” is badly designed and makes poor choices in these scenarios, or else I am not as altruistic as I like to think.
My intuition for whether Pascalian Muggings are net positive expected value is correctly discerning that they are not, no matter the nature of the promised reward. Even in the case of an offer of increasing amounts of utility (defined as “anything for which twice as much is always twice as good”), I can still think that the offer to produce it is less and less likely to pay off the more that is offered.
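That second explanation can be made concrete. Suppose, purely as an illustrative assumption, that your credence in a promise of N utils being honored falls off faster than N grows, say proportional to 1/N². Then the expected value of the offer actually shrinks as the promise gets bigger:

```python
# Assumed prior (illustrative only): credence in a payout of N utils
# falls off as 1/N^2, penalizing larger claims superlinearly.
def credence(n_utils):
    return 1.0 / n_utils ** 2

for promised in (10, 1_000, 1_000_000):
    expected = credence(promised) * promised  # works out to 1/N
    print(f"{promised} utils promised -> expected value {expected:.2e}")
```

Under this prior the expected value is 1/N, so the mugger weakens their own case with every escalation. The constant-marginal-utility definition (“twice as much is always twice as good”) applies to the payoff itself, not to the probability of receiving it.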