Probably, but precision matters. Mixing up mean vs sum when talking about different quantities of lives is confusing. We do agree that it’s all about how to convert to utilities. I’m not sure we agree on whether 2x the number of equal-value lives is 2x the utility. I say no, many Utilitarians say yes (one of the reasons I don’t consider myself Utilitarian).
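To make the mean-vs-sum distinction concrete, here is a toy calculation (the single per-life value $u$ is my own simplification, purely for illustration): for $n$ equally valued lives,

$$\text{sum} = \sum_{i=1}^{n} u = n\,u, \qquad \text{mean} = \frac{1}{n}\sum_{i=1}^{n} u = u.$$

Doubling $n$ doubles the sum but leaves the mean unchanged, so "2x the lives is 2x the utility" is specifically a claim about sum-style aggregation, which is exactly the step at issue.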
> …game which maximizes log utility and still leaves you with nothing in 99% of cases.
Again, precision in description matters—that game maximizes log wealth, which is presumed to be close to linear in utility. And it’s not clear that it shows what you think—it never leaves you with nothing, just very often a small fraction of your current wealth, and sometimes astronomical wealth. I think I’d play that game quite a bit, at least until my utility curve for money flattened even more than simple log, because I’m at least partly a satisficer rather than an optimizer on that dimension. Oh, and only if I could trust the randomizer and counterparty to actually pay out, which becomes impossible in the real world pretty quickly.
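To illustrate the shape of that game, here is a minimal simulation sketch in Python. The specific numbers (a 99% chance of keeping 1% of your wealth, a 1% chance of a 1e300 multiplier) are invented stand-ins, not anything stated above; the point is only that a single bet can raise expected log wealth while usually leaving you a small fraction of your stake and never literally zero.

```python
import math
import random

# Hypothetical one-shot gamble (numbers invented purely for illustration):
# with probability 0.01 wealth is multiplied by 1e300,
# otherwise (probability 0.99) you keep only 1% of it.
P_WIN, WIN_MULT = 0.01, 1e300
LOSE_MULT = 0.01

# Expected change in log wealth from taking the bet (declining scores 0),
# so an expected-log-wealth maximizer takes it.
expected_log_gain = P_WIN * math.log(WIN_MULT) + (1 - P_WIN) * math.log(LOSE_MULT)
print(f"expected change in log wealth: {expected_log_gain:+.2f}")  # ~ +2.35

# Simulate many players who each take the bet once, starting from wealth 1.0.
random.seed(0)
outcomes = [
    WIN_MULT if random.random() < P_WIN else LOSE_MULT for _ in range(100_000)
]

outcomes.sort()
median = outcomes[len(outcomes) // 2]
print(f"median outcome: {median}")                                 # 0.01 of starting wealth
print(f"players left with literally zero: {outcomes.count(0.0)}")  # 0
```

Run as written, it reports a positive expected log-wealth change (about +2.35), a median outcome of 1% of starting wealth, and no player reduced to exactly nothing.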
But that only shows that other factors in the calculation interfere at extreme values, not that the underlying optimization (maximize utility, and convert resources to utility according to your goals/preferences/beliefs) is wrong.