It’s meaningless to talk about optimizing epistemic rationality without talking about your utility function. There are a lot of questions you could get better at answering. Which ones you want to answer depends on what kind of decisions you want to make, which depends on what you value.
But probabilities are a useful latent variable in the reasoning process, and it can be worthwhile instrumentally to try to have accurate beliefs, as this may help out in a wide variety of situations that we cannot predict in advance. So there is still the question of which beliefs it is most important to make more accurate.
Also, I believe the OP is trying to write code for a variant of the calibration game, so it is somewhat intrinsically necessary for him to score probabilities directly.
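Since scoring probabilities directly means using a proper scoring rule, here is a minimal sketch of what that might look like, assuming Python and the logarithmic scoring rule (the Brier score would work just as well); the function and variable names are illustrative, not taken from the OP's code:

```python
import math

def log_score(prob: float, outcome: bool) -> float:
    """Logarithmic score for a stated probability.

    prob is the reported probability that the outcome is True.
    The score is log of the probability assigned to whatever
    actually happened, so it is always <= 0, and closer to 0
    is better. Because this rule is proper, expected score is
    maximized by reporting your true belief.
    """
    p = prob if outcome else 1.0 - prob
    return math.log(p)

# Example round of a calibration game:
# (stated probability, whether the claim turned out true)
answers = [(0.9, True), (0.7, False), (0.6, True)]
total = sum(log_score(p, o) for p, o in answers)
print(f"Total log score: {total:.3f}")
```

One caveat for a game setting: the log score assigns negative infinity to a confident wrong answer (prob of 0.0 or 1.0 on the losing side), so a real implementation would likely clamp reported probabilities away from the extremes.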