When one researcher booted up a program he hoped would be AI-like, Yudkowsky said he believed there was a 5 percent chance the Singularity was about to happen.
Um what? Being cautious in one’s predictions is being a dumbass now?
Overestimating isn’t caution. Being risk averse while holding an accurate probability estimate is not the same thing as inflating the probability itself. (Though the general human tendency toward overconfidence is certainly relevant here.)
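To make the difference concrete, here is a minimal Python sketch (the numbers are hypothetical, not from anyone in this thread): risk aversion changes how you act on a belief; overestimation changes the belief itself.

    # Hypothetical numbers: risk aversion adjusts the decision rule,
    # overestimation distorts the belief itself.
    p_true = 0.01            # accurate probability of the bad outcome
    p_inflated = 0.05        # an overestimate of that same probability
    loss = 1_000_000         # assumed cost if the outcome occurs

    expected_loss = p_true * loss                # 10,000: accurate belief
    risk_averse_budget = 3.0 * expected_loss     # 30,000: same belief, extra safety margin
    inflated_expected_loss = p_inflated * loss   # 50,000: distorted belief

    print(expected_loss, risk_averse_budget, inflated_expected_loss)

The risk-averse agent still reports 0.01; the caution lives in the safety margin, not in the probability.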
HAHA. What a dumbass.
Hmm. This certainly wasn’t helpful, and doesn’t score well on the comedy scale. Definitely downvoted.
There is another sense in which a probability estimate can be cautious: by not being too close to 0 or 1.
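One way to see why (a minimal sketch; the log scoring rule here is my choice of illustration, not something anyone above specified): a probability away from the extremes bounds how badly you can be penalized whichever way the event turns out.

    import math

    # Under the log scoring rule, the penalty is -log(p) if the event
    # happens and -log(1 - p) if it doesn't; middling probabilities
    # cap the worst case.
    for p in (0.001, 0.05, 0.5):
        worst = max(-math.log(p), -math.log(1 - p))
        print(f"p={p}: worst-case penalty {worst:.2f} nats")
    # p=0.001 -> 6.91, p=0.05 -> 3.00, p=0.5 -> 0.69

In that sense a 5 percent estimate is far more "cautious" than a 0.1 percent one.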
Overestimating the probability of X is just underestimating the probability of not(X).
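Spelled out, with Delta as the size of the overestimate:

    \hat{P}(X) = P(X) + \Delta
    \quad\Longleftrightarrow\quad
    \hat{P}(\neg X) = 1 - \hat{P}(X) = P(\neg X) - \Delta

so any "caution" added to one side of the estimate is exactly the overconfidence subtracted from the other.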