It actually just occurred to me that the intelligence professions might benefit greatly from some x-rationality. We may not have to derive gravity from an apple, but the closer we come to that ideal, the less likely failures of intelligence become.
Intelligence professionals are constantly engaged in a very Bayesian activity, incorporating new data into estimates of probabilities and patterns. An ideal Bayesian would be a fantastic analyst.
Ja, in particular, modern intelligence professionals seem to have trouble separating the information they receive from others from the information they are trying to pass on themselves: they report only their final combined judgment instead of their likelihood-message, which any student of Bayes nets knows is Wrong.
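To make the likelihood-message point concrete, here is a minimal sketch (with made-up numbers) of what goes wrong when analysts pass along posteriors instead of likelihood ratios: a shared prior gets counted once per analyst instead of once total.

```python
# Sketch: why analysts should report likelihood ratios, not posteriors.
# All probabilities and likelihood ratios below are hypothetical.

def odds(p):
    """Convert a probability to odds."""
    return p / (1 - p)

def prob(o):
    """Convert odds back to a probability."""
    return o / (1 + o)

prior = 0.2            # shared prior P(H), i.e. odds of 1:4
lr_a, lr_b = 3.0, 2.0  # independent likelihood ratios from two analysts

# Correct: combine the prior with each likelihood ratio exactly once.
correct = prob(odds(prior) * lr_a * lr_b)   # 0.25 * 3 * 2 = odds of 1.5 -> 0.6

# Wrong: each analyst folds the shared prior into a personal posterior,
# and the aggregator multiplies posterior odds, double-counting the prior.
post_a = prob(odds(prior) * lr_a)
post_b = prob(odds(prior) * lr_b)
naive = prob(odds(post_a) * odds(post_b))   # odds of 0.375 -> about 0.27

print(correct, naive)
```

With a 1:1 prior the two methods happen to agree, which is part of why the mistake is easy to miss; with any other prior the naive aggregate is biased toward the prior.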