This whole thing reminds me of Scott Alexander’s Pyramid essay. That’s a really good case where there seems to be a natural statistical reference class, where you can easily get a giant Bayes factor that’s “statistically well justified”, and where you can answer every counterargument with “well, the likelihood is 1 in 10^5 that the pyramid’s latitude would match the speed of light in m/s”. That’s a good reductio against taking even fairly well-justified-sounding subjective Bayes factors at face value.
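For concreteness, here’s a minimal sketch of how that naively “well justified” number comes out. The speed of light (299,792,458 m/s) and the Great Pyramid’s latitude (about 29.9792° N) are real figures; the match window and the uniform-latitude assumption are illustrative choices on my part, picked to reproduce the ~1 in 10^5 order of magnitude:

```python
# Naive "statistically well justified" Bayes factor for the pyramid coincidence.
# Assumption: under H0 ("site placed with no regard to c"), the latitude is
# uniform over 0-90 degrees N, and the match counts if it lands within a
# small window around 29.9792 degrees (the digits of c = 299,792,458 m/s).

match_window_degrees = 0.001      # illustrative precision of the "match"
latitude_range_degrees = 90.0     # possible northern latitudes under H0

# Likelihood of the coincidence under chance placement.
p_match_given_chance = match_window_degrees / latitude_range_degrees  # ~1.1e-5

# Likelihood under H1 ("the builders encoded c deliberately"), taken as ~1.
p_match_given_design = 1.0

naive_bayes_factor = p_match_given_design / p_match_given_chance
print(f"Naive Bayes factor: {naive_bayes_factor:,.0f}")  # ~90,000
```

The point of the reductio is that nothing in this arithmetic is wrong; the problem is everything the calculation leaves out, like how many coordinates, constants, and unit systems were available to search over before this particular match was reported.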
And I think it’s built into your criticism that, because the problem is social and there’s hidden evidence filtering going on, there will also tend to be a meta-level explanation for why my coincidence-finding is different from your coincidence-finding.