This seems like a dead thread, but I’ll chance it anyway.
Eliezer, there’s something off about your calculation of the expected score:
The expected score is something that should go up the more certain I am of something, right?
But in fact the expected score is highest when I’m most uncertain about something: If I believe with equal probability that snow might be white or non-white, the expected score is actually 0.5(−1) + 0.5(−1) = −1. This is the highest possible expected score.
In any other case, the expected score will be lower, as you calculate for the 70/30 case.
It seems like what you should be trying to do is minimize your expected score but maximize your actual score. That seems weird.
Looks like you’ve just got a sign error, anukool_j. −1 is the lowest possible expected score. The expected score in the 70/30 case is −0.88.
Graph.
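For concreteness, here’s a minimal sketch (Python, assuming the log-base-2 scoring rule from the post; the function name is just for illustration) that computes the expected score for both cases:

```python
import math

def expected_log_score(p):
    """Expected log-base-2 score for a binary belief, where you assign
    probability p to the outcome that actually occurs with probability p."""
    return p * math.log2(p) + (1 - p) * math.log2(1 - p)

print(expected_log_score(0.5))  # -1.0  (the lowest possible, not the highest)
print(expected_log_score(0.7))  # about -0.88 (higher, i.e. better, than -1)
```

Since the score is a logarithm of a probability, it is always negative, so the 50/50 case’s −1 sits below the 70/30 case’s −0.88 rather than above it.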