Nice work, congrats! Looks fun and useful, better than the calibration apps I’ve seen so far (including one I made that used confidence intervals; I had a proper scoring rule too!)
My score:
Current score: 3.544 after 10 plays, for an average score per play of 0.354.
Thanks, Emile,
Is there anything you’d like to see added?
For example, I was thinking of running it on Node.js and logging players’ scores, so you could see how you compare. (I don’t have a way to host this right now, though.)
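Roughly, something like the sketch below is what I have in mind: a tiny Node server that appends each play’s score and tells you how you rank. The route names, the flat-file storage, and the response fields are all placeholders for illustration, not anything that exists yet.

```typescript
// Sketch only: log each play's score to a flat file and report a rank.
// "/score", "/scores", and scores.json are hypothetical names.
import * as http from "http";
import * as fs from "fs";

const FILE = "scores.json";

// read all logged scores (empty list if nobody has played yet)
function load(): number[] {
  return fs.existsSync(FILE) ? JSON.parse(fs.readFileSync(FILE, "utf8")) : [];
}

http.createServer((req, res) => {
  if (req.method === "POST" && req.url === "/score") {
    let body = "";
    req.on("data", (chunk) => (body += chunk));
    req.on("end", () => {
      const score = JSON.parse(body).score as number;
      const scores = load();
      scores.push(score);
      fs.writeFileSync(FILE, JSON.stringify(scores));
      // report how many logged players did better, so you can compare
      const beatenBy = scores.filter((s) => s > score).length;
      res.end(JSON.stringify({ plays: scores.length, beatenBy }));
    });
  } else if (req.method === "GET" && req.url === "/scores") {
    res.end(JSON.stringify(load())); // full list, e.g. for a score distribution
  } else {
    res.statusCode = 404;
    res.end();
  }
}).listen(3000);
```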
Or another possibility is to add diagnostics: e.g., were you systematically setting your guess too high, or was it fluctuating more than the data would really justify (under some model for the prior/posterior, say)?
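As a sketch of what those two diagnostics could look like, assuming each play is recorded as a (guess, observation) pair and using a normal-normal conjugate model as a stand-in prior/posterior (the data shape and parameter names are made up, not the app’s actual API):

```typescript
// Sketch only: bias and over-reaction diagnostics under a normal-normal model.
interface Round { guess: number; observation: number; }

function diagnostics(rounds: Round[], priorMean: number, priorVar: number, obsVar: number) {
  // 1) Systematic bias: positive means your guesses ran high on average.
  const bias =
    rounds.reduce((s, r) => s + (r.guess - r.observation), 0) / rounds.length;

  // 2) Over-reaction: how much your guess moved between rounds, versus how
  //    much the posterior mean of the conjugate model would have moved.
  let mean = priorMean;
  let variance = priorVar;
  const posteriorMeans: number[] = [];
  for (const r of rounds) {
    const k = variance / (variance + obsVar); // standard normal-normal update
    mean = mean + k * (r.observation - mean);
    variance = variance * (1 - k);
    posteriorMeans.push(mean);
  }
  const jumps = (xs: number[]) => xs.slice(1).map((x, i) => Math.abs(x - xs[i]));
  const avg = (xs: number[]) => xs.reduce((a, b) => a + b, 0) / xs.length;
  const overReaction =
    avg(jumps(rounds.map((r) => r.guess))) / avg(jumps(posteriorMeans));

  // overReaction > 1 suggests fluctuating more than the model says you should
  return { bias, overReaction };
}
```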
Also, I’d be happy to have pointers to your calibration apps or others you’ve found useful.