Does “sure” mean 100% confidence? If so, is this a correct statement?
Or would it be more correct to say:

- we're extraordinarily confident that Newton's gravitation is close to correct,
- we're extraordinarily confident that Einstein's gravitation is even closer,
- we're mildly confident that we will find no closer theories, though one alternative explanation for dark matter would be modified gravitation, so we're considerably less confident than we would be if there were no known evidence suggesting inaccuracies in Einstein's gravitation, by a factor of something like P(Einstein|DarkMatter) (a toy update sketched below).
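To make that last point concrete, here is a toy Bayes update with invented numbers; the prior and the likelihoods are pure assumptions for illustration, not estimates of the actual physics:

```python
# Toy Bayes update: how dark-matter observations could discount confidence
# in unmodified Einsteinian gravitation. Every number here is invented.

prior_einstein = 0.95     # assumed prior: gravitation needs no modification
p_obs_if_einstein = 0.5   # assumed P(dark-matter observations | Einstein correct)
p_obs_if_modified = 0.9   # assumed P(dark-matter observations | modified gravity)

p_obs = (p_obs_if_einstein * prior_einstein
         + p_obs_if_modified * (1 - prior_einstein))
posterior_einstein = p_obs_if_einstein * prior_einstein / p_obs

print(posterior_einstein)  # ~0.913: the P(Einstein|DarkMatter) in question
```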
This (the ignoring of cost) seems like a flaw in Bayesian analysis, and makes me think there is probably some extension to it, omitted here for simplicity, that takes cost, value, or utility into account.
For example, the “cost” of a Bayesian filter deciding to show a salesman a spam email is far lower than the “cost” of the same filter deciding to hide an email from a million-dollar sales lead.
So, while the calculation of the probabilities should not take cost into account, it feels like making decisions based on those probabilities should.
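This is essentially what decision theory adds on top of Bayesian inference: leave the probabilities alone and fold the costs into the decision rule, picking the action with the lowest expected cost. A minimal sketch of the spam example, with entirely hypothetical dollar figures:

```python
# Cost-sensitive decision rule on top of an unchanged Bayesian probability.
# Both dollar costs are hypothetical, chosen to mirror the example above.

COST_READ_SPAM = 0.10          # attention wasted by showing one spam email
COST_MISS_LEAD = 1_000_000.00  # a hidden email from a million-dollar lead

def expected_cost(p_spam: float, action: str) -> float:
    """Expected cost of 'show' or 'hide', given the filter's P(spam)."""
    if action == "show":
        return p_spam * COST_READ_SPAM       # risk: the salesman reads spam
    return (1.0 - p_spam) * COST_MISS_LEAD   # risk: the lead is never seen

def decide(p_spam: float) -> str:
    """Pick the cheaper action in expectation, not the likelier label."""
    return min(("show", "hide"), key=lambda a: expected_cost(p_spam, a))

# Even at P(spam) = 0.999, hiding carries 0.001 * $1M = $1,000 of expected
# loss versus about $0.10 for showing, so the decision is still "show".
print(decide(0.999))  # -> show
```

The break-even point is P(spam) = COST_MISS_LEAD / (COST_MISS_LEAD + COST_READ_SPAM), which with these numbers sits within a hair of 1: the asymmetry in costs, not the probability alone, drives the decision.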
For example: the chances of our getting wiped out in the near future by a natural disaster are small. Yet the potential consequences are dire, and the net cost per person of detection is low, or even negative. Therefore, we have a global near-earth-object detection network, a tsunami and earthquake detection network, fire watch towers, weather and climate monitors, disease tracking centers, and so on.
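The same expected-cost arithmetic suggests why those networks pencil out. With made-up numbers, a one-in-a-million annual chance of catastrophe times an enormous per-person loss can still dwarf a small per-person detection budget:

```python
# Back-of-the-envelope expected-value check for a detection network.
# Every number below is invented purely for illustration.

P_DISASTER_PER_YEAR = 1e-6      # assumed annual chance of the catastrophe
LOSS_PER_PERSON = 1e7           # assumed per-person cost if it happens
MITIGATED_FRACTION = 0.5        # assumed share of the loss early warning averts
NETWORK_COST_PER_PERSON = 0.50  # assumed annual per-person cost of detection

expected_loss_averted = (P_DISASTER_PER_YEAR * LOSS_PER_PERSON
                         * MITIGATED_FRACTION)           # $5.00 per person
print(expected_loss_averted > NETWORK_COST_PER_PERSON)   # True: worth funding
```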
If this extension to Bayesian analysis exists, this seems a sensible place to link to it.