Has Eliezer made explicit updates about this? Maybe @Rob Bensinger knows. If he has, I’d like to see it posted prominently and clearly somewhere. Either way, I wonder why he doesn’t mention it more often. Maybe he does, but only in fiction.
[...] I think that recognizing successful training and distinguishing it from failure is the essential, blocking obstacle.
Does this come up in the Dath Ilan stories?
There are experiments done now and again on debiasing interventions for particular biases, but it tends to be something like, “Make the students practice this for an hour, then test them two weeks later.” Not, “Run half the signups through version A of the three-month summer training program, and half through version B, and survey them five years later.”
Surely there is more to say about this now than in 2009. Eliezer had some idea of the replication crisis back then, but I think he has become much more pessimistic about academia in the time since.
But first, because people lack the sense that rationality is something that should be systematized and trained and tested like a martial art, that should have as much knowledge behind it as nuclear engineering, whose superstars should practice as hard as chess grandmasters, whose successful practitioners should be surrounded by an evident aura of awesome.
I think there’s gotta be more to say about this too. Since then we have seen Tetlock’s Superforecasting, Inadequate Equilibria[1], the confusing story of CFAR[2][3][4], and the rise to prominence of EA. I can now read retrospectives by accomplished rationalists arguing over whether rationality increases accomplishment, but I always come away feeling highly uncertain. (Not epistemically helpless, but frustratingly uncertain.) What do we make of all this?
Eliezer asks:
Why are there schools of martial arts, but not rationality dojos? (This was the first question I asked in my first blog post.) Is it more important to hit people than to think?
My answer, which gets progressively less charitable, and is aimed at no one in particular: thinking rationally appears to be a lower priority than learning particular mathematical methods, obtaining funding and recruits, mingling at parties, following the news, scrolling social media, and playing video games.
Consensus is that Eliezer verifiably outperformed the medical establishment with the lumenator, right?
https://www.lesswrong.com/posts/B9kP6x5rpmuCzpfWb/comment-reply-my-low-quality-thoughts-on-why-cfar-didn-t-get
https://rationality.org/studies/2015-longitudinal-study
https://www.lesswrong.com/posts/5K7CMa6dEL7TN7sae/3-levels-of-rationality-verification#AZb895EcZBdueP6QC
https://www.lesswrong.com/posts/MajyZJrsf8fAywWgY/a-lesswrong-crypto-autopsy
https://www.astralcodexten.com/p/why-im-less-than-infinitely-hostile