In that case, we’re done. Standard probability theory/Cox’s Theorem/de Finetti would give us a ready-made criticism of any conjecture that wasn’t isomorphic to probability theory, so we’d have isomorphism, which is all we need. Once we have functional equivalence, we can prove results in probability theory, apply Bayes’ theorem, etc., and then at the end translate back into Popperesque.
(Also, IIRC, Jaynes only claimed to have proven that rational reasoning must be isomorphic to probability theory)
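For reference, the update rule being invoked above is just the textbook statement of Bayes’ theorem, with $H$ a hypothesis and $E$ the evidence:

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}, \qquad P(E) = \sum_i P(E \mid H_i)\,P(H_i).$$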
I don’t quite get your point. You are saying that if you bring up betting (a real-life scenario where probability is highly relevant), then given your explanations that help you come up with priors (background knowledge needed to be able to do any math about it), you shouldn’t act on those explanations in ways that violate the math. OK, so what? Probability math is useful in some limited cases, given some explanatory knowledge to get set up. No one said otherwise.
Every decision is a bet.
I think you are beginning to get the point. :) The key missing fact is that the resulting math is highly constraining, to the point that if you actually follow it all the way you will be acting in a manner isomorphic to a Bayesian utility-maximizer.
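To make “acting in a manner isomorphic to a Bayesian utility-maximizer” concrete, here is a minimal sketch of that decision rule; the states, actions, probabilities, and utilities are invented purely for illustration:

```python
# Minimal sketch of Bayesian expected-utility maximization.
# All states, actions, probabilities, and utilities are illustrative only.

def expected_utility(action, posterior, utility):
    """Expected utility of an action: utility weighted by the posterior over states."""
    return sum(p * utility[(action, state)] for state, p in posterior.items())

def best_action(actions, posterior, utility):
    """Choose the action with the highest posterior expected utility."""
    return max(actions, key=lambda a: expected_utility(a, posterior, utility))

# Example bet: take an umbrella or not, given a posterior belief about rain.
posterior = {"rain": 0.3, "dry": 0.7}
utility = {
    ("take_umbrella", "rain"): 1.0,   ("take_umbrella", "dry"): -0.1,
    ("leave_umbrella", "rain"): -1.0, ("leave_umbrella", "dry"): 0.0,
}
print(best_action(["take_umbrella", "leave_umbrella"], posterior, utility))
# -> take_umbrella (expected utility 0.23 vs -0.30)
```

Note that everything substantive is packed into the `posterior` and `utility` arguments; the maximization step itself is mechanical.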
But the background-knowledge part is highly unconstrained (given just your math). When a math algorithm gives constrained output but you have wide scope in the choice of input, that’s not so good: you need to do something to constrain the inputs.
It seems to me you just dump all the hard parts of thinking into the priors and then say the rest follows. But the hard parts are still there: we still need to work out good explanations to use as input for the final step of not doing anything that violates math/logic.
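A small sketch of the point about inputs: the same likelihood and the same update rule, fed two different priors, give very different posteriors. All hypotheses and numbers here are invented for illustration.

```python
# Same likelihood, same update rule, two different priors,
# two very different posteriors. All numbers are illustrative only.

def posterior(prior, likelihood):
    """Bayes' rule over a discrete set of hypotheses."""
    unnormalized = {h: prior[h] * likelihood[h] for h in prior}
    z = sum(unnormalized.values())
    return {h: p / z for h, p in unnormalized.items()}

likelihood = {"H1": 0.8, "H2": 0.2}   # P(E | H), fixed by the model

prior_a = {"H1": 0.5, "H2": 0.5}      # one choice of background knowledge
prior_b = {"H1": 0.01, "H2": 0.99}    # another choice

print(posterior(prior_a, likelihood))  # {'H1': 0.8, 'H2': 0.2}    -- H1 favoured
print(posterior(prior_b, likelihood))  # {'H1': ~0.04, 'H2': ~0.96} -- H2 favoured
```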