You are absolutely right: any framework that punishes you for being right would be bad. My point is that improving your calibration helps a surprising amount and is much more achievable than the "just git good" that improving prediction requires.
I will try to put your point into the draft when I am off work, thanks.