I agree; the typical human brain balks and runs away when faced with a scale of merit whose max-point is 0.
Zero does seem more appropriate either as a minimum or a midpoint. If everything is going to be negative, then flip it around and say ‘less is good’! But the main problem I have with a system where you can only lose honor for making predictions is that it essentially rewards never saying anything of importance that could be contradicted. That sounds a bit too much like real life for some reason. ;)
There’s gotta be a way to fix this so that a perfectly calibrated person would gain a tiny amount of honor each day rather than lose it. It might not be elegant, though. Got any ideas?
The tricky part is not so much making up the equations as deciding what criteria to rate the scale against. We would inevitably be injecting something arbitrary.
You’re supposed to have a probability for everything. The closest you can get to not guessing is to give every possibility equal probability, in which case you’d lose honor even faster than normal.
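For concreteness, here’s a tiny sketch of why “not guessing” is the fastest way to bleed honor, assuming the penalty is a logarithmic one, i.e. you lose log2(1/p) points for the probability p you gave to what actually happened (that particular rule is my assumption here, not something fixed above):

```python
import math

def honor_loss(p: float) -> float:
    """Honor lost on one question under an assumed logarithmic penalty:
    lose -log2(p) points, where p is the probability you assigned
    to the outcome that actually happened."""
    return -math.log2(p)

# A reasonably confident 80% prediction that comes true:
print(honor_loss(0.8))     # ~0.32 points lost

# "Not guessing" by spreading probability evenly over 10 possibilities:
print(honor_loss(1 / 10))  # ~3.32 points lost -- much faster
```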
You could give yourself honor equal to the square of the probability you gave, but then you’d have an incentive to split things into as many questions as possible. After all, if you gave a single probability for what happens over your entire life, you couldn’t get more than one point of honor. With the system I mentioned first, you’d lose exactly the same honor either way.
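A rough sketch of the contrast, again assuming the first system is that logarithmic penalty (my assumption, and it’s exactly what makes the “same honor either way” claim work, since logs turn products into sums):

```python
import math

def quadratic_honor(p: float) -> float:
    """Honor gained = square of the probability you assigned
    to the outcome that occurred (the scheme suggested above)."""
    return p ** 2

def log_honor_loss(p: float) -> float:
    """Honor lost under the assumed logarithmic rule from before."""
    return -math.log2(p)

# One big lifetime prediction at probability 0.9**20, versus the same
# claim split into 20 independent questions at probability 0.9 each.
parts = [0.9] * 20
whole = math.prod(parts)

# Quadratic rule: splitting is hugely rewarded.
print(quadratic_honor(whole))                  # ~0.015 -- less than one point
print(sum(quadratic_honor(p) for p in parts))  # 16.2 points

# Logarithmic rule: splitting changes nothing (up to rounding), since
# -log(p1*p2*...*pn) == -log(p1) - log(p2) - ... - log(pn).
print(log_honor_loss(whole))                   # ~3.04
print(sum(log_honor_loss(p) for p in parts))   # ~3.04 -- the same
```

So the squared rule pays you for slicing one claim into many pieces, while the logarithmic one is indifferent to how you carve the question up.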