One option would be to have another percentage — a meta-percentage, e.g. “What credence do I give to ‘this is an accurate model of the world’?” For coin flips, you’re 99.999% sure that 50% is a good model. For bloxors, you’re ~0% sure that 50% is a good model.
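One way to picture the two-number idea is to carry the first-order probability and the meta-credence around as a pair, and let the meta-credence control how hard new evidence moves the first-order number. This is only an illustrative sketch, not standard terminology: the mapping from `model_trust` to Beta-prior pseudo-counts (`n_equiv`) is a hypothetical choice made here for demonstration.

```python
from dataclasses import dataclass

@dataclass
class Credence:
    p: float            # first-order probability of the event
    model_trust: float  # meta-credence that p comes from a good model

def update(c: Credence, heads: bool, n_equiv: float = 1000.0) -> Credence:
    """One Bernoulli observation under a Beta prior whose strength scales
    with model_trust (hypothetical mapping, for illustration only)."""
    strength = 2 + c.model_trust * n_equiv   # prior pseudo-observations
    a = c.p * strength + (1.0 if heads else 0.0)
    b = (1 - c.p) * strength + (0.0 if heads else 1.0)
    return Credence(p=a / (a + b), model_trust=c.model_trust)

# Both start at 50%, but respond very differently to one observation:
coin    = update(Credence(p=0.5, model_trust=0.99999), heads=True)  # stays ~0.50
bloxors = update(Credence(p=0.5, model_trust=0.0), heads=True)      # jumps to ~0.67
```

The point of the sketch is that the single number 50% is identical in both cases; it’s the second number that says whether one observation should barely nudge you or swing you substantially.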
This is a model that I always tend to fall back on, but I can never find a name for it, so I find it hard to look into. I have always assumed I am misunderstanding Bayesian statistics and that credence is somehow already factored in. That doesn’t really seem to be the case, though.
Does the Scott Alexander post lay this out? I am having difficulty finding it.
The closest term I have been able to find is the Kelly criterion, which tells you what fraction of your wealth you should rationally stake on a probabilistic outcome. Replace “wealth” with credence and maybe it could be useful for decisions, but even this misses the point!
He doesn’t really. Here’s the original article:
https://www.astralcodexten.com/p/mr-tries-the-safe-uncertainty-fallacy
Also there was a long follow-up where he insists 50% is the right answer, but it’s subscriber-only:
https://www.astralcodexten.com/p/but-seriously-are-bloxors-greeblic