I suppose it depends on what kinds of decisions you’re talking about making (e.g., keeping AIs from destroying humanity). I was thinking along the lines of day-to-day decision making, in which people generally manage to survive for decades in spite of ridiculously flawed beliefs, so it seems there are lots of situations where performance doesn’t appear to degrade nearly so sharply.
At any rate, I guess I’m with ciphergoth: the more interesting question is why 99% accurate is “maybe maybe” okay, but 95% is “hell no”. Where do those numbers come from?
Someone who gets it 99% right is useful to me; someone who gets it 95% right is so much work to deal with that I usually don’t bother.
No one gets it 99% right. (Modulo my expectation that we are speaking only of questions of a minimal difficulty; say, at least as difficult as the simplest questions that the person has never considered before.)
When I was a cryptographer, an information source with a .000001% bulge (information content above randomness) would break a code wide open for me. Lack of bias was much more important than % right.
From a curious non-cryptographer: what size of corpus are you talking about here?
You’re onto me. Yes, that’s with a large corpus. The kind you get when people encrypt non-textual information. So, I lied a little. You need a bigger bulge with shorter messages.
I didn’t mean to call you out—I was just curious. A curve of data set size versus required bulge would be interesting.
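A minimal sketch of that curve, assuming a simple biased-bit model (an illustration only, nothing specific to the cryptanalysis being described): the corpus size needed to detect a bulge scales roughly as one over the bulge squared.

```python
def corpus_needed(bulge, sigmas=3.0):
    """Rough number of bits needed to detect a bias ("bulge") in an
    otherwise-random bit stream.

    Model (a simplifying assumption, not actual cryptanalysis): each bit is 1
    with probability 0.5 + bulge. The sample mean over n bits has standard
    deviation ~0.5 / sqrt(n), so for the bulge to stand out at `sigmas` sigma
    you need roughly n >= (sigmas * 0.5 / bulge) ** 2, i.e. n grows as 1/bulge^2.
    """
    return (sigmas * 0.5 / bulge) ** 2

# A rough "data set size versus required bulge" curve:
for bulge in (0.05, 0.01, 1e-3, 1e-6, 1e-8):  # 1e-8 is about the ".000001%" figure
    print(f"bulge {bulge:.0e}: ~{corpus_needed(bulge):.2g} bits")
```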
In that case, a second information source of that quality wouldn’t have been that much use to you.
The first person who gets it 95% right would be very valuable. But there are diminishing returns.