Anyone know how predictions of less than 50% are supposed to be handled by PredictionBook? I predicted a thing would happen with 30% confidence. It happened. Am I supposed to judge the prediction right or wrong?
It shows me a graph of confidence/accuracy that starts from 50%, and I’m wondering if I’m supposed to be phrasing predictions in such a way that I always list >50% confidence (i.e. I should have predicted that X wouldn’t happen, with 70% confidence, rather than that it would, with 30%).
Judge it as “right”. PB automatically converts your 10% predictions into 90%-not predictions for the calibration graph, but under the hood everything stays with the probabilities you provided. Hope this cleared things up.
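The flipping described above can be sketched in a few lines. This is just an illustration of the idea, not PredictionBook’s actual code; the function name and data shape are made up for the example:

```python
def calibration_points(predictions):
    """Flip sub-50% predictions to their complement for a calibration graph.

    `predictions` is a list of (stated_confidence, came_true) pairs.
    A prediction below 50% is re-expressed as its negation: the
    confidence becomes 1 - p and the outcome is inverted.
    """
    points = []
    for confidence, came_true in predictions:
        if confidence < 0.5:
            confidence, came_true = 1 - confidence, not came_true
        points.append((confidence, came_true))
    return points

# A 30% "it will happen" that happened is plotted as a
# 70% "it won't happen" that didn't:
print(calibration_points([(0.3, True)]))  # [(0.7, False)]
```

Either way the prediction counts the same toward your calibration; the flip only affects where it lands on the graph.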
Another PredictionBook question: it gives me a graph showing my 50/60/70/80/90% confidence accuracy, but I’m not sure how (or whether) it handles my 85%, 63%, etc. claims. Do those get rounded to the nearest bucket, or not show up at all?