The agents trying to make money would, over time, develop more specific beliefs, reaching more substantive agreement than they had initially. It's like the difference between agreeing with someone that there's a 50% chance a coin will come up heads and agreeing that there's a 99% chance it will come up heads: the second agreement is more substantive, even though both cases involve agreement about probabilities.
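One way to make "more substantive" precise, as a sketch (this is my own formalization, not something the original argument commits to): measure how much a shared belief constrains expectations by the Shannon entropy of the agreed-on distribution. Agreeing on 50% leaves a full bit of uncertainty about the flip; agreeing on 99% leaves almost none.

```python
import math

def bernoulli_entropy(p: float) -> float:
    """Shannon entropy (in bits) of a coin with P(heads) = p."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# Agreeing on 50% leaves maximal uncertainty about the outcome;
# agreeing on 99% pins the outcome down almost completely.
print(bernoulli_entropy(0.50))  # 1.0 bit: the least substantive agreement possible
print(bernoulli_entropy(0.99))  # ~0.08 bits: a far more substantive agreement
```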
In Popperian epistemology, it's a virtue to propose hypotheses that are easily disproven, which isn't the same thing as always incrementally moving toward truth: it's more like babble-and-prune. Of course, the instruction to converge on truth doesn't quite say "get closer to truth at every step, no backtracking"; it's just that Bayesians are likely to read it that way. A toy sketch of the contrast is below.
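As a minimal illustration of babble-and-prune (a toy sketch of my own, not a claim about how Popperian inquiry actually proceeds): bold conjectures are generated blindly, and the data only ever eliminates them. No step is required to land closer to the truth than the last; progress comes from the pruning, not from monotone improvement.

```python
import random

def babble(n: int) -> list[float]:
    """Babble: blindly propose n bold point hypotheses about a coin's bias."""
    return [round(random.random(), 2) for _ in range(n)]

def refuted(bias: float, heads: int, flips: int) -> bool:
    """Prune test: reject a bias that makes the observed count wildly improbable
    (a crude three-sigma cutoff, chosen here just for illustration)."""
    expected = bias * flips
    sigma = (flips * bias * (1 - bias)) ** 0.5
    return abs(heads - expected) > 3 * sigma + 1

def babble_and_prune(heads: int, flips: int, rounds: int = 100) -> list[float]:
    """Keep every conjecture the data fails to refute; order of proposal is irrelevant."""
    survivors = []
    for _ in range(rounds):
        for h in babble(10):                 # babble: generate candidates at random
            if not refuted(h, heads, flips): # prune: discard what the data kills
                survivors.append(h)
    return survivors

# Example: 80 heads in 100 flips prunes away hypotheses far from 0.8,
# without any single proposal needing to improve on the previous one.
print(sorted(set(babble_and_prune(80, 100))))
```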
And of course, epistemology is unsolved: no one can distill the correct theoretical epistemology into practical steps, because no one knows what it is in the first place.