Continuity axiom of vNM
In a previous post, I left a somewhat cryptic comment on the continuity/Archimedean axiom of vNM expected utility.
(Continuity/Archimedean) This axiom (and acceptable weaker versions of it) is much more subtle than it seems; “No choice is infinitely important” is what it seems to say, but “‘I could have been a contender’ isn’t good enough” is closer to what it does. Anyway, that’s a discussion for another time.
Here I’ll explain briefly what I mean by that. Let’s drop the axiom and see what could happen. First of all, we could have a utility function taking non-standard real values. This allows some things to be infinitely more important than others. A simple illustration is lexicographical ordering: e.g. my utility function consists of the amount of euros I end up owning, with the amount of sex I get serving as a tie-breaker.
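As a toy illustration of that lexicographic ordering (my own sketch; the outcome pairs and function name are made up, not from the post), tuple comparison captures it directly: euros decide the ranking, and sex only matters when the euro amounts are exactly tied.

```python
# A minimal sketch of a lexicographic (non-Archimedean) preference:
# outcomes are (euros, sex) pairs; euros always dominate, and sex
# only serves as a tie-breaker when the euro amounts are exactly equal.

def lexicographic_key(outcome):
    """Sort key for the lexicographic ordering described above."""
    euros, sex = outcome
    return (euros, sex)  # Python compares tuples lexicographically

outcomes = [(100, 5), (101, 0), (100, 0)]
best = max(outcomes, key=lexicographic_key)

assert best == (101, 0)  # one extra euro beats any amount of the tie-breaker
assert max([(100, 0), (100, 5)], key=lexicographic_key) == (100, 5)  # tie-breaker only on exact ties
```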
There is nothing wrong with such a function! First, because in practice it functions as a standard utility function (I’m unlikely to be able to indulge in sex in a way that has absolutely no costs or opportunity costs, so the amount of euros will always predominate). Second, because even if it does make a difference, it’s still expected utility maximisation, just a non-standard version.
But worse things can happen if you drop the axiom. Consider this decision criterion: I will act so that, at some point, there will have been a chance of me becoming heavyweight champion of the world. This is compatible with all the other vNM axioms, but is obviously not what we want as a decision criterion. In the real world, such a criterion is vacuous (there is a non-zero chance of me becoming heavyweight champion of the world right now), but it certainly could apply in many toy models.
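One way to formalise that criterion (my own reading; the outcome names and probabilities below are hypothetical) is as a preference relation that only asks whether a lottery gives any non-zero chance of the target outcome. The sketch makes the pathology explicit: every lottery with a non-zero chance of the championship sits in a single top indifference class, however tiny that chance is.

```python
# A minimal sketch of the "I could have been a contender" criterion:
# a lottery is (weakly) preferred as soon as it offers *any* non-zero
# chance of the target outcome, no matter how small.

def prefers(lottery_a, lottery_b, target="champion"):
    """Weak preference of A over B under the contender criterion."""
    has_chance_a = lottery_a.get(target, 0.0) > 0.0
    has_chance_b = lottery_b.get(target, 0.0) > 0.0
    return has_chance_a or not has_chance_b

train_hard  = {"champion": 0.001, "injured": 0.999}
stay_home   = {"couch": 1.0}
tiny_chance = {"champion": 1e-12, "couch": 1.0 - 1e-12}

assert prefers(train_hard, stay_home)       # any non-zero chance beats none
assert prefers(tiny_chance, train_hard)     # ...and all non-zero chances are equivalent
assert not prefers(stay_home, tiny_chance)  # a zero chance is strictly worse
```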
That’s why I said that the continuity axiom is protecting us from “I could have been a contender (and that’s all that matters)” type reasoning, not so much from “some things are infinitely important (compared to others)”.
Also notice that the quantum many-worlds version of the above criterion (“I will act so that the measure of type-X universes is non-zero”) does not sound quite as stupid, especially if you bring in anthropics.