I guess there are lots of things to say here.
Firstly, note that countable additivity can only fail in one direction: the probability of a countable union of disjoint events can be strictly larger than the sum of their individual probabilities, but it can’t be strictly smaller.
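(For concreteness: if A[1], A[2], … are disjoint then, for every N, finite additivity and monotonicity give m(A[1] ∪ A[2] ∪ …) ≥ m(A[1] ∪ … ∪ A[N]) = m(A[1]) + … + m(A[N]); letting N → ∞ shows the probability of the union is at least the sum. And strict inequality really can occur: take a finitely additive m on the positive integers with m({n}) = 0 for every n but total mass 1, which exists, e.g. by extending natural density to all subsets; the union of the singletons then has probability 1 while the sum of their probabilities is 0.)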
Now suppose for a moment that we’re only interested in discrete random variables that take at most countably many values. Then it’s easy to see that any additive probability measure m which assigns probabilities to each singleton set can be uniquely decomposed as the sum of a sigma-additive measure k and an additive measure m* such that m*(S) = 0 for any finite set S. Perhaps m is one of a set of hypotheses that we’re trying to evaluate by applying Bayesian reasoning to a data sample (consisting of some integer values). But then whichever data we receive, m* will assign it zero probability, and therefore drop out as irrelevant. The posterior probabilities will depend only on the sigma-additive components k.
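To see the “drops out” step concretely, here’s a throwaway Python sketch (my own toy representation, not anything standard): each hypothesis is stored as its singleton masses k({n}), plus a purely descriptive note about where m* is supposed to put the leftover mass; the Bayes update never has any reason to read that note.

    # Toy model: a hypothesis m on the non-negative integers is stored as its
    # sigma-additive part k (the singleton masses) plus a descriptive note about
    # how m* spreads the remaining mass 1 - sum(k) over infinite sets.
    # Since m*(S) = 0 for every finite S, the likelihood of any concrete integer
    # observation never consults that note.
    from math import prod

    def likelihood(hypothesis, n):
        # m({n}) = k({n}) + m*({n}) = k({n}), because {n} is a finite set
        return hypothesis["k"].get(n, 0.0)

    def posterior(hypotheses, priors, data):
        weights = [p * prod(likelihood(h, n) for n in data)
                   for h, p in zip(hypotheses, priors)]
        total = sum(weights)
        return [w / total for w in weights]

    # Same k, different m*: the leftover mass 0.5 "lives on the evens" for h1
    # and "on the odds" for h2 (say, via two different Banach limits).
    h1 = {"k": {0: 0.25, 1: 0.25}, "m_star": "leftover 0.5 on the even numbers"}
    h2 = {"k": {0: 0.25, 1: 0.25}, "m_star": "leftover 0.5 on the odd numbers"}

    print(posterior([h1, h2], priors=[0.5, 0.5], data=[0, 1, 1]))
    # -> [0.5, 0.5]: whatever integers we observe, the posteriors track only k.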
Clearly the continuous case is more complicated—I don’t know whether one can make an analogous argument. (Perhaps it would depend on denying that we ever receive ‘real numbers’ in our data. Rather, all we get are finite amounts of information about each real.)
Perhaps you want something like a “Dutch book” argument to show that dropping countable additivity is irrational? Well, any such argument would presuppose that if you would accept each of the wagers A[n] individually then you must also accept the countable sum of the A[n] (provided it’s well-defined); if you reject that presupposition, you remain perfectly self-consistent.
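(For concreteness, the standard construction runs like this: suppose my probability for each singleton {n} is p[n] with Σ p[n] = s < 1, while my probability for the whole space is 1. Let A[n] be the wager in which I receive p[n] up front and pay out 1 if the outcome is n; by my own lights each A[n] is exactly fair. If I accepted all of them together, I would collect s and pay out exactly 1 whichever outcome occurs, a guaranteed loss of 1 − s. But that guaranteed loss only appears once you insist that acceptance is closed under taking the countable sum.)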
You make a lot of interesting points.
In your example with m*, what if we have a random variable that takes values equal to 1/n, for each positive integer n? Then maybe we measure whether n is even. Wouldn’t the probability of this depend on m* as well as k?
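(To make the worry concrete: the event “n is even” is the infinite set {2, 4, 6, …}, so its probability is k({2, 4, 6, …}) + m*({2, 4, 6, …}), and nothing forces the second term to be zero.)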
For the case of a Dutch book, the problem is that acceptance of wagers doesn’t satisfy countable additivity: being willing to take each of the A[n] needn’t commit you to taking their countable sum. Maybe we could identify some subset of wagers where acceptance does behave countably additively, and use this in a proof?