I take the axiom of independence to be tier two: an intuitively strong rationality principle, but not one that's enforced by nasty things that happen if we violate it. It surprises me that I've seen a tier-one justification for only one of the four VNM axioms. Actually, I suspect that independence could be justified in a tier-one way too; it's just that I haven't seen it done.
Suppose A < B but pA+(1-p)C > pB+(1-p)C, a violation of independence. A genie offers you a choice between pA+(1-p)C and pB+(1-p)C, but charges you a penny for the former; since you prefer it, you pay. Then, if the lottery resolves so that A is about to happen rather than C, the genie offers to make B happen instead, charging you another penny; since you prefer B to A, you pay again. But now you've paid two pennies and ended up with exactly the lottery pB+(1-p)C you could have had for free, so you're doing something wrong. (Of course, these money-pumping arguments rely on the possibility of making arbitrarily small side payments.)
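Here's a minimal sketch of the pump in Python; the specific ranking, the lottery labels, and the penny price are all illustrative assumptions, not part of the argument itself:

```python
# Money-pump sketch for an independence violation: the agent ranks B over A,
# yet ranks pA+(1-p)C over pB+(1-p)C. Everything here is hypothetical.

rank = {"A": 0, "B": 1, "pB+(1-p)C": 2, "pA+(1-p)C": 3}

def prefers(x, y):
    """Strict preference: x is ranked above y."""
    return rank[x] > rank[y]

wealth = 0.0
holding = "pB+(1-p)C"  # the genie's free default offer

# Step 1: pay a penny to swap to the mixture containing A.
if prefers("pA+(1-p)C", holding):
    holding, wealth = "pA+(1-p)C", wealth - 0.01

# Step 2: the lottery resolves to its non-C branch, so A is about to happen;
# the genie offers to substitute B for another penny.
holding = "A"
if prefers("B", holding):
    holding, wealth = "B", wealth - 0.01

# Outcome: B, the same thing the untouched lottery pB+(1-p)C would have
# delivered on this branch, but two pennies poorer.
print(holding, wealth)  # -> B -0.02
```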
I think many people would put continuity at tier two, a strong intuitive principle. I don't see why, personally. To me it seems like an assumption that only makes sense if we already have the intuition that expected utility is going to be the right way of doing things. That puts it in tier three for me: another structural axiom.
Sure, it is structural, but your description of structural axioms made them sound like assumptions it would be better not to have to accept, since if they turn out not to be true, that would be very inconvenient for the theorem. But if the continuity axiom is not an accurate description of your preferences, pretending it is changes almost nothing, so accepting it anyway seems well-justified from a pragmatic point of view. See this and this (section "Doing without Continuity") for explanations.
Savage chooses not to define probabilities on a sigma-algebra. I haven't yet seen any decision theorist who prefers to use sigma-algebras. Similarly, he derives only finite additivity, not countable additivity; this also seems common among decision theorists.
This is annoying. Does anyone here know why they do this? My guess is that it's because their nice theorems about the finite case don't have straightforward generalizations that refer to sigma-algebras. (I'm guessing this mainly because it appears to be the case for the VNM theorem, which only works if lotteries assign positive probability to at most finitely many outcomes.)
Is it indeed the case that the VNM theorem cannot be generalized to the measure-theoretic setting?
Hypothesis: Let X be a compact Polish space, and let R ⊆ P(X) × P(X) be closed in the weak topology and satisfy the VNM axioms (in the sense that μ ≤ ν iff (μ, ν) ∈ R). Then there exists a continuous u : X → ℝ s.t. (μ, ν) ∈ R iff E_μ[u] ≤ E_ν[u].
Counterexamples?
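(Not a counterexample, but a quick finite-case sanity check of the easy direction: when X is finite, a relation defined by expected utility satisfies independence by linearity. The utility and sampling scheme below are arbitrary choices for illustration.)

```python
import numpy as np

# Finite-case check: on a finite X, define (mu, nu) in R iff
# E_mu[u] <= E_nu[u] for a fixed u, and confirm independence holds.
rng = np.random.default_rng(0)
n = 5                               # |X|
u = rng.normal(size=n)              # arbitrary utility function on X

def leq(mu, nu):                    # (mu, nu) in R
    return mu @ u <= nu @ u

def random_dist():                  # a random point of the simplex P(X)
    w = rng.exponential(size=n)
    return w / w.sum()

for _ in range(1000):
    mu, nu, rho = random_dist(), random_dist(), random_dist()
    p = rng.uniform()
    # Independence: mixing both sides with the same rho preserves the relation.
    assert leq(mu, nu) == leq(p * mu + (1 - p) * rho,
                              p * nu + (1 - p) * rho)
print("independence holds on all sampled triples")
```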
One is also tempted to conjecture a version of this hypothesis where X is just a measurable space, R is closed in the strong convergence topology, and u is just measurable. However, there's the issue that if u is unbounded in both directions, there will be μ s.t. E_μ[u] is undefined. Does this mean u automatically comes out bounded from at least one direction? Or do we need to add an additional axiom, e.g. that there exists a μ which is a global minimum (or maximum) in the preference ordering?
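To make the failure mode concrete (a standard example, not one specific to this setting): take X = ℕ, u(n) = (−2)^n, and μ({n}) = 2^{−n} for n ≥ 1. Then E_μ[u] = Σ_{n≥1} 2^{−n}(−2)^n = Σ_{n≥1} (−1)^n, whose positive and negative parts both diverge, so the expectation is undefined.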
Both of your conjectures are correct. In the measurable / strong topology case, u will necessarily be bounded (from both directions), though it does not follow that the bounds are achievable by any probability distribution.
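For a concrete instance of that last point: on X = ℕ, u(n) = 1 − 2^{−n} is bounded with sup u = 1, yet any (countably additive) probability distribution μ gives E_μ[u] = 1 − Σ_n μ(n)2^{−n} < 1, so the bound is never attained.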
I described the VNM theorem as failing on sigma-algebras because closedness of the preference relation (in the weak or strong topology) is an additional assumption, and one that seems much more poorly motivated than the VNM axioms (in Abram's terminology, it is purely structural).
I think one can argue that a computationally bounded agent cannot reason about probabilities with infinite precision, and that therefore its preferences have to depend on probabilities in a way that is, in some sense, sufficiently regular; that could justify the topological condition. It would be nice to make this idea precise. Btw, it seems that the topological condition implies the continuity axiom.
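Rough sketch of that last claim (my own argument, details unchecked): fix μ < ν < ρ and let λ(p) = pμ + (1−p)ρ, which is continuous in p with respect to the weak topology. If R is closed, then {p ∈ [0,1] : λ(p) ≤ ν} and {p ∈ [0,1] : ν ≤ λ(p)} are both closed. Since λ(1) = μ < ν, the open complement of the second set contains 1; since λ(0) = ρ > ν, the open complement of the first contains 0. Hence there exist p, q ∈ (0,1) with λ(p) < ν < λ(q), i.e. pμ + (1−p)ρ < ν < qμ + (1−q)ρ, which is exactly the VNM continuity (Archimedean) axiom.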