And because they make the same predictions, that relative probability is irrelevant in practice: we could use AGR just as well as GR for prediction.
There is a subtle sense in which the difference between AGR and GR is relevant. While the difference doesn’t change the predictions, it may change the utility function. An agent that cares about angels (if they exist) might do different things if it believes itself to be in the AGR world rather than the GR world. Since the theories make identical predictions, the agent’s belief depends only on its priors (and any irrationality), not on which world it is in. Nonetheless, this means the agent will pay to avoid having its priors modified, even though the modification doesn’t change the agent’s predictions in the slightest.
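To make this concrete, here is a minimal sketch, with entirely made-up payoffs and a hypothetical "build_shrine" action, of an agent whose choice depends on its credence in AGR versus GR even though the two theories predict identically:

```python
def expected_utility(action, p_agr):
    """Expected utility of an action given credence p_agr that AGR is true.

    AGR and GR make identical empirical predictions; they differ only in
    whether angels exist, which matters to this agent's utility function.
    Payoff numbers are illustrative, not drawn from the original post.
    """
    payoffs = {
        # Worth 10 if angels exist (probability p_agr), costs 1 regardless.
        "build_shrine": p_agr * 10 - 1,
        "do_nothing": 0.0,
    }
    return payoffs[action]


def best_action(p_agr):
    """Pick the action maximising expected utility under the given prior."""
    return max(["build_shrine", "do_nothing"],
               key=lambda a: expected_utility(a, p_agr))


# Same predictions either way, but the chosen action depends on the prior:
print(best_action(p_agr=0.5))   # build_shrine (expected utility 4 > 0)
print(best_action(p_agr=0.05))  # do_nothing   (expected utility -0.5 < 0)
```

Since modifying the prior flips which action the agent takes, and the agent evaluates that flip as a loss by its current lights, it would pay to prevent the modification, despite its predictions being untouched.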