If it’s literally 99%, then maybe maybe. If it’s actually more like 95%, then hell no.
I can’t extract any meaning from these percentages. Well over 99% of an ordinary person’s beliefs are true, because they are about prosaic, uncontroversial things like “I have fingernails”.
I can’t either, but my basic reaction is simply that, in practice, near-perfect accuracy is critical here. If, in order to act correctly, a person needs to get more than 70 cognitive steps right, their expected value falls by half for every 1% chance of being wrong on each step.
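A rough sketch of the arithmetic behind that claim, assuming the steps are independent and that a single mistake forfeits all of the value (the all-or-nothing model the next reply questions):

```python
# Rough arithmetic behind the "70 cognitive steps" claim, assuming the
# steps are independent and that any single mistake forfeits all value
# (an all-or-nothing model).

def prob_all_correct(n_steps: int, error_rate: float) -> float:
    """Probability of getting every one of n_steps right."""
    return (1.0 - error_rate) ** n_steps

# With 70 steps, each extra 1% of per-step error roughly halves the
# chance of a flawless run, and hence the expected value under this model.
print(prob_all_correct(70, 0.01))  # ~0.495
print(prob_all_correct(70, 0.02))  # ~0.243
print(prob_all_correct(70, 0.05))  # ~0.028
```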
Assuming any action anywhere short of optimal results in zero value, sure. In practice?
In practice, if you are only talking about the 70 most important steps that people are prone to messing up, that could easily be correct, not to mention the probability of doing harm. Certainly there are far more than ten such steps, each of which reduces value by more than 80%.
I suppose it depends on what kinds of decisions you’re talking about making (e.g. keeping AIs from destroying humanity). I was thinking along the lines of day-to-day decision making, in which people generally manage to survive for decades in spite of ridiculously flawed beliefs, so it seems there are lots of situations where performance doesn’t degrade nearly so sharply.
At any rate, I guess I’m with ciphergoth: the more interesting question is why 99% accurate is “maybe maybe” okay, but 95% is “hell no”. Where do those numbers come from?
Someone who gets it 99% right is useful to me; someone who gets it 95% right is so much work to deal with that I usually don’t bother.
No one gets it 99% right. (Modulo my expectation that we are speaking only of questions of a minimal difficulty; say, at least as difficult as the simplest questions that the person has never considered before.)
When I was a cryptographer, an information source with a .000001% bulge (information content above randomness) would break a code wide open for me. Lack of bias was much more important than % right.
From a curious non-cryptographer: what size of corpus are you talking about here?
You’re onto me. Yes, that’s with a large corpus. The kind you get when people encrypt non-textual information. So, I lied a little. You need a bigger bulge with shorter messages.
I didn’t mean to call you out—I was just curious. A curve of data set size versus required bulge would be interesting.
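A back-of-the-envelope version of that curve, treating the source as a biased coin and asking how large a sample is needed before a bulge of size eps stands out from ordinary sampling noise (my framing of the question, not the original commenter’s method):

```python
# Approximate corpus size needed to detect a "bulge" (bias) of size eps
# in a binary source, using the standard 1/eps^2 scaling: the bias must
# exceed roughly z * 0.5 / sqrt(N) to stand out at ~95% confidence.

def samples_needed(eps: float, z: float = 1.96) -> float:
    """Approximate sample count to detect a bias of size eps."""
    return z ** 2 / (4 * eps ** 2)

for eps in (1e-2, 1e-4, 1e-6, 1e-8):  # 1% down to .000001% (as a fraction)
    print(f"bias {eps:.0e}: ~{samples_needed(eps):.1e} samples")
```

The required corpus grows roughly as 1/eps², so a .000001% bulge only becomes exploitable once the corpus is enormous, which fits the “large corpus” caveat above.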
In that case, a second information source of that quality wouldn’t have been that much use to you.
The first person who gets it 95% right would be very valuable. But there are diminishing returns.
I was thinking exactly the same thing. I have literally no idea what ‘percentage’ of the things I believe are true, and certainly wouldn’t be willing to put a figure on what percentage is acceptable.