Following curi’s steps, we’d lower our standards. How do you feel about the theory “I don’t want to spend more time on this, and getting 1000 heads if it’s double-headed is 2^1000 times more likely than getting 1000 heads if it’s ordinary, so I’ll make the same decisions I’d make if I knew it were double-headed unless I get a rough estimate of at least a factor-of-2^990 difference in how much I care about the outcome of one of those decisions”?
If Bayes generates the right answer here, whereas naive Popperian reasoning without it goes spectacularly wrong, maybe that should suggest something. It also ignores my main point: Popper’s theory does not admit weak criticisms, of which the coin coming up heads is just one example.
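For concreteness, here is a minimal sketch of that calculation, assuming an arbitrary 1-in-1000 prior that the coin is double-headed (the prior and the function name are illustrative, not anything either side specified):

```python
from fractions import Fraction

def posterior_double_headed(num_heads, prior=Fraction(1, 1000)):
    """Posterior probability that the coin is double-headed after observing
    num_heads consecutive heads; the default prior is an arbitrary assumption."""
    p_run_if_double = Fraction(1)                # double-headed: all heads is certain
    p_run_if_fair = Fraction(1, 2) ** num_heads  # fair coin: probability (1/2)^n
    numerator = prior * p_run_if_double
    evidence = numerator + (1 - prior) * p_run_if_fair
    return numerator / evidence

# The likelihood ratio after 1000 heads is 2^1000, which swamps any modest
# prior, so the posterior is indistinguishable from 1.
print(float(posterior_double_headed(1000)))
```

Exact rationals (Fraction) are used so the arithmetic stays exact no matter how long the run of heads is.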
What you appear to be suggesting amounts to Bayesian epistemology done wrong.
For coin flipping analysis, use Bayes’ theorem (not Bayesian epistemology).
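Spelled out for this coin (a sketch, with $p$ standing for whatever prior probability one assigns to the double-headed hypothesis), that calculation is just

$$P(\text{double-headed} \mid 1000\ \text{heads}) \;=\; \frac{P(1000\ \text{heads} \mid \text{double-headed})\,P(\text{double-headed})}{P(1000\ \text{heads})} \;=\; \frac{1 \cdot p}{1 \cdot p + 2^{-1000}\,(1 - p)},$$

which is effectively 1 for any prior $p$ that is not itself astronomically small.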
Whether you have a double-headed coin or not is still a form of knowledge.
The “Bayes’ theorem: good, Bayesian epistemology: bad” perspective won’t wash.