There is no such sequence. Proof:
In order for wager n to have nonnegative expected utility, P(death)*U_0 + (1-P(death))*U_(n+1) >= U_n. Equivalently, P(death this time | survived until n) <= (U_(n+1)-U_n) / (U_(n+1)-U_0).
Assume the worst case, equality. Then 1 - P(death this time) = (U_n-U_0)/(U_(n+1)-U_0): the cumulative probability of survival decreases by exactly the same factor as your utility (measured from U_0, conditioned on survival) increases. These survival probabilities multiply, and the product telescopes, so the same holds for a sequence of borderline wagers.
With a bounded utility function, the worst sequence of wagers you'll accept leaves you, in total, with P(survival) >= (U_1-U_0)/(U_max-U_0), i.e. P(death) <= (U_max-U_1)/(U_max-U_0). Which is exactly what you'd expect.
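The telescoping can be checked numerically. A minimal sketch, where the utility scale (U0, Umax, and the geometric schedule of utilities) is made up for illustration:

```python
# Numeric check of the telescoping argument above. The utility scale
# here (U0, Umax, the geometric schedule) is hypothetical.
U0 = 0.0      # utility of death
Umax = 100.0  # upper bound of the utility function

# Utilities after each survived wager, rising from U_1 toward Umax.
U = [U0 + (Umax - U0) * (1.0 - 0.5 ** n) for n in range(1, 31)]

survival = 1.0
for Un, Unext in zip(U, U[1:]):
    # Borderline (equality) wager: P(death) = (U_(n+1)-U_n)/(U_(n+1)-U_0)
    p_death = (Unext - Un) / (Unext - U0)
    survival *= 1.0 - p_death

# The product telescopes: P(survive all) = (U_1-U_0)/(U_N-U_0) ...
assert abs(survival - (U[0] - U0) / (U[-1] - U0)) < 1e-12
# ... so total P(death) stays below (Umax-U_1)/(Umax-U_0).
assert 1.0 - survival <= (Umax - U[0]) / (Umax - U0)
```

Each factor (U_n-U_0)/(U_(n+1)-U_0) cancels against its neighbor, which is why only the first and last utility levels matter for the cumulative bound.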
When there’s an infinite number of wagers, there can be a distinction between accepting the whole sequence at one go and accepting each wager one after another. (There’s a paradox associated with this distinction, but I forget what it’s called.) Your second-last sentence seems to be a conclusion about accepting the whole sequence at one go, but I’m worried about accepting each wager one after another. Is the distinction important here?
Are you thinking of the Riemann series theorem? That doesn’t apply when the payoff matrix for each bet is the same (and finite).
No, it was this thing. I just couldn’t articulate it.
A bounded utility function probably gets you out of all problems along those lines.
Certainly it’s good in the particular case: your expected utility (in the appropriate sense) is an increasing function of the bets you accept, and increasing sequences that are bounded above don’t have convergence issues.
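A small sketch of that point, with made-up numbers: take a long run of strictly favorable wagers (death risk below the break-even level from the proof above), and the expected utilities form an increasing sequence that never passes Umax, hence converge.

```python
# Sketch with hypothetical numbers: accept strictly favorable wagers
# and watch expected utility increase while staying under Umax.
U0, Umax = 0.0, 100.0
u = 50.0          # starting utility U_1
survival = 1.0
eus = []
for _ in range(30):
    u_next = Umax - (Umax - u) / 2.0             # next utility level
    p_borderline = (u_next - u) / (u_next - U0)  # break-even death risk
    p = p_borderline / 2.0                       # strictly favorable wager
    survival *= 1.0 - p
    u = u_next
    eus.append(survival * u + (1.0 - survival) * U0)

# Increasing and bounded above, hence convergent.
assert all(b > a for a, b in zip(eus, eus[1:]))
assert eus[-1] < Umax
```

With borderline (equality) wagers the sequence would instead be constant at U_1; any strict margin makes it strictly increasing, and boundedness does the rest.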
How would you bound your utility function? Just pick some arbitrary converging function f, and set utility’ = f(utility)? That seems arbitrary. I suspect it might also make theorems about expectation maximization break down.
No, I’m not advocating changing utility functions. I’m just saying that if your utility function is bounded, you don’t have either of these problems with infinity. You don’t have the convergence problem nor the original problem of probability of the good outcome going to zero. Of course, you still have the result that you keep making bets till your utility is maxed out with very low probability, which bothers some people.