Carry around a notepad and form probabilistic opinions on lots of little questions whose answers you can find out soon after. Record the probability you assigned to each correct answer, and where applicable add tags like “politics”, “project completion”, “my social status”, “trivia”. Put it all into a spreadsheet or something and see whether you’re miscalibrated globally and for different tags.
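A minimal sketch of that spreadsheet step, assuming a hypothetical predictions.csv with columns “probability”, “correct”, and “tags” (the file and column names are mine, not anything standard):

```python
# A minimal sketch, assuming a hypothetical predictions.csv with columns
# "probability" (your stated confidence), "correct" (1 or 0), and "tags"
# (semicolon-separated, e.g. "politics;trivia").
import csv
from collections import defaultdict

def load_predictions(path):
    rows = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            rows.append({
                "p": float(row["probability"]),
                "correct": row["correct"].strip() == "1",
                "tags": [t for t in row["tags"].split(";") if t],
            })
    return rows

def calibration_table(rows, bins=10):
    """Bucket predictions by stated confidence and compare each bucket's
    confidence level with the fraction that actually came true."""
    buckets = defaultdict(list)
    for r in rows:
        b = min(int(r["p"] * bins), bins - 1) / bins   # e.g. 0.87 -> 0.8 bucket
        buckets[b].append(r["correct"])
    return {b: (len(v), sum(v) / len(v)) for b, v in sorted(buckets.items())}

def by_tag(rows):
    """Recompute the calibration table separately for each tag."""
    tagged = defaultdict(list)
    for r in rows:
        for t in r["tags"]:
            tagged[t].append(r)
    return {t: calibration_table(rs) for t, rs in tagged.items()}

if __name__ == "__main__":
    rows = load_predictions("predictions.csv")
    print("global:", calibration_table(rows))
    for tag, table in by_tag(rows).items():
        print(tag, table)
```

If the 0.8 bucket only comes true 60% of the time, you’re overconfident there; doing this per tag shows where the miscalibration lives.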
This can get gamed pretty easily, though, by selecting questions you have more prior knowledge of, or whose actual probabilities you already know, over things where you’re more likely to be wrong.
Except that could be exactly the point: the ability to identify which things you’re likely to assign accurate probabilities to, and to recognize when you aren’t as likely to.
However, there’s still the problem of simply not reporting certain things to boost your score. There could be something that takes into account, or measures, the ability to identify when you’re likely to be wrong.
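One candidate for “something that measures” this, offered as a hedged suggestion rather than anything proposed above, is a proper scoring rule such as the Brier score: it is minimized in expectation by reporting your true confidence, so both overclaiming certainty and hedging everything to 50% cost you points on average.

```python
def brier_score(predictions):
    """predictions: list of (stated_probability, outcome) pairs, where
    outcome is 1 if the thing happened and 0 if it didn't.
    Lower is better; always answering 50% scores 0.25."""
    return sum((p - outcome) ** 2 for p, outcome in predictions) / len(predictions)

# Example: three forecasts, two of which came true.
print(brier_score([(0.9, 1), (0.7, 0), (0.6, 1)]))  # -> 0.22
```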
If, to improve your score, you break the habit of claiming confidence you don’t really have, then it seems the exercise has had its intended effect, no?
Or: guess confidence intervals. 95% might not be as useful as 50%; test yourself not only on how often you are under or over, but also make sure that 50% (or 5%) of the time the true value falls outside the range you guessed.
If you try to guess things you’re really sure about, this forces you to quantify exactly how sure you are, and it makes those guesses no more or less useful than the ones you’re much less sure about.
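A short sketch of that interval-coverage check; the (low, high, true_value) triples are made-up numbers for illustration:

```python
def coverage(guesses):
    """guesses: list of (low, high, true_value) triples for intervals you
    stated at one confidence level. Returns the fraction of true values
    that landed inside the guessed range."""
    hits = sum(1 for low, high, true in guesses if low <= true <= high)
    return hits / len(guesses)

# For 50% intervals this fraction should come out near 0.5;
# for 95% intervals, near 0.95.
fifty_pct_guesses = [(10, 20, 17), (100, 300, 350), (1, 5, 2), (40, 60, 90)]
print(coverage(fifty_pct_guesses))  # -> 0.5
```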
How do I tell?