What about the idea of “well calibrated judgement”, where I must be right 9 times out of 10 when I say something has 90 per cent probability? Yudkowsky discussed it in “Cognitive Biases Potentially Affecting Judgment of Global Risks”, if I remember correctly.
In that case, am I assigning some probability distribution over my own judgements, which could be about completely different external objects?
I’m not quite sure what you mean here, but I don’t think the idea of calibration is directly related to the subjective/objective dichotomy. Both subjective and objective Bayesians could desire to be well calibrated.
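To make the calibration idea concrete, here is a minimal Python sketch (my own illustration with made-up data, not anything from Yudkowsky’s article): for each stated confidence level, it compares the stated probability with the observed frequency of correct claims.

```python
from collections import defaultdict

def calibration_table(judgements):
    """judgements: list of (stated_probability, was_correct) pairs.
    Groups by the exact stated probability, which works when
    judgements use round levels like 0.6 or 0.9."""
    groups = defaultdict(list)
    for prob, correct in judgements:
        groups[prob].append(correct)
    # For each level: (observed frequency of correct claims, sample size).
    return {p: (sum(outcomes) / len(outcomes), len(outcomes))
            for p, outcomes in sorted(groups.items())}

# Hypothetical data: a well calibrated judge who says "90 per cent"
# should be right about 9 times out of 10.
data = [(0.9, True)] * 9 + [(0.9, False)] + [(0.6, True)] * 3 + [(0.6, False)] * 2
for p, (freq, n) in calibration_table(data).items():
    print(f"stated {p:.0%}: observed {freq:.0%} over {n} judgements")
```

Note that this check is agnostic about what probability means, which is the point above: a subjectivist and an objectivist can both run it over their past judgements and ask whether the frequencies match.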