What more could you want? A theorem proving that any optimal decision theory must necessarily use Bayesian updating?
There already is such a theorem. From Wikipedia:
A decision-theoretic justification of the use of Bayesian inference was given by Abraham Wald, who proved that every Bayesian procedure is admissible. Conversely, every admissible statistical procedure is either a Bayesian procedure or a limit of Bayesian procedures.
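For concreteness, here is the setup that quote is summarizing, in the standard risk-function notation. This formalization is my own paraphrase, not part of the Wikipedia text:

```latex
% My paraphrase of the standard decision-theoretic setup; not part of the quote.
% A decision rule $\delta$ maps observed data $X$ to an action; its risk in
% state $\theta$ is the expected loss
\[
  R(\theta, \delta) \;=\; \mathbb{E}_{X \sim P_\theta}\!\bigl[\, L\bigl(\theta, \delta(X)\bigr) \,\bigr].
\]
% $\delta$ is admissible iff no rule $\delta'$ dominates it, i.e. there is no
% $\delta'$ with $R(\theta, \delta') \le R(\theta, \delta)$ for every $\theta$
% and $R(\theta, \delta') < R(\theta, \delta)$ for at least one $\theta$.
% Wald's complete class theorem (as quoted above): every Bayes rule (one that
% minimizes the average risk $r(\pi, \delta) = \int R(\theta, \delta)\, d\pi(\theta)$
% for some prior $\pi$) is admissible, and every admissible rule is a Bayes
% rule or a limit of Bayes rules.
```

The detail that matters below is that the loss L(θ, δ(X)) takes only the state and the chosen action as arguments.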
As far as I can tell from Wikipedia's description of admissibility, it makes the same assumption as CDT: that the outcome depends only on your action and the state of the environment, not on any other properties of your algorithm. This assumption fails in multi-player games.
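To illustrate where that assumption breaks, here is a toy sketch of my own (the names `play_against_copy`, `defect`, and `cooperate` are hypothetical): a one-shot Prisoner's Dilemma played against an exact copy of yourself, where your payoff depends on which algorithm you run, not just on your action taken in isolation.

```python
# Toy sketch (hypothetical example, not from the original comment): a one-shot
# Prisoner's Dilemma against an exact copy of yourself.  The payoff depends on
# the algorithm (the copy runs the same one), not just on the action with the
# environment held fixed, which is the assumption admissibility/CDT relies on.

PAYOFF = {  # (my action, opponent action) -> my payoff
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def play_against_copy(strategy):
    """Both players run the same strategy; return my payoff."""
    my_action = strategy()
    opponent_action = strategy()  # the copy makes the same choice
    return PAYOFF[(my_action, opponent_action)]

def defect():
    return "D"  # dominant action if the opponent's move were a fixed state

def cooperate():
    return "C"

# Holding the opponent's move fixed, "D" dominates "C" (5 > 3 and 1 > 0), so a
# CDT-style analysis picks defect().  But the choice of algorithm also fixes
# the opponent's move, and cooperate() ends up with the higher payoff:
print(play_against_copy(defect))     # -> 1
print(play_against_copy(cooperate))  # -> 3
```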
So your quote actually means: if you're going to use CDT, then Bayes is the optimal way to derive your probabilities.