I was recently listening to a podcast discussion that included two people who had been involved in military special operations units—one in the Navy SEALs and one in the US Army Special Forces. I was struck by their extremely high level of training, dedication, commitment, and overall ability—but also by how this had in large part been squandered on fighting a destructive and unproductive war in Afghanistan, supporting questionable CIA operations, and so on.
It occurs to me that people in the rationalist community are at least notionally working on much more important causes, but with far less training, commitment, personal skill, etc.
This leads to my question—what would it look like if similar levels of training effort, funding, selection, etc. were going into preparing rationalists to do as much good in the world as possible as is currently going into preparing elite military units? (I don’t think that this would necessarily look much like elite military training, to be clear!)
If this first exercise comes up with anything that seems promising—are there ways that we could potentially 80/20 this and get much of the benefit without the high costs?
(nb: this post is just personal musings and not “official” on behalf of CFAR or any other organization.)
I think I would be a much better-trained rationalist if I did my basic rationality practices as regularly as I do physical exercise. The practices are:
I keep a list of things I have changed my mind about. Everything from geopolitics to my personal life.
I hardly ever go looking outside my echo chamber for things that could challenge or correct my beliefs, because I find it very effortful & unpleasant. (You know what else is effortful? Pushups.)
I sometimes write letters to my future or past selves. I tried giving comprehensive life advice to my 16-year-old self, and ended up learning a lot about advice and spurious counterfactuals...
I sometimes do the Line of Retreat negative visualization. For immediate things, I tend to do it out loud while on a walk. For political beliefs, I slowly add to a private document over time and occasionally review it.
I maintain a list of my disagreements with various public thinkers. Helps me separate tribal thinking from truth-seeking.
I made an Anki deck for memorizing my defensive epistemology heuristics: “is this explainable by selection effects?”, Proving Too Much, “is this claim consistent with their previous claim?”, Reversal Test, etc.
I notice I’m at a point where I can make surprisingly good Fermi estimates if I spend a couple of minutes thinking really hard, usually not otherwise. Feels like there’s room for improvement. (A toy example of the kind of decomposition I mean follows after this list.)
Hard to practice with regularity, but these days I try to restrain myself from jumping into an in-progress debate when I overhear one, and instead sit on the sidelines and patiently wait for openings to point out (double) cruxes.
Prompt myself to answer, “what would a slightly improved version of me do in this situation? What would I think if I were more rested and better hydrated?” It’s embarrassing how much mileage I have gotten out of role-playing as myself.
Privately journaling about my internal conflicts or difficult feelings. Simple but underpracticed (much like sit-ups).
I wrote down a page of predictions about the state of crypto tech in 2031, aiming for maximum specificity & minimal future embarrassment. Similar for Twitter in this post. I guess I might call this “conscientious futurism” or just “sticking my neck out”.
Pro/Con lists. They’re effortful & time-intensive. But so is half-focused vacillating, which is what I do by default.
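As a toy illustration of the Fermi-estimate item above: the classic piano-tuners question, with made-up round numbers standing in for whatever I'm actually estimating. The point is the decomposition into factors I can each guess to within a factor of a few, not the specific inputs.

```python
# Toy Fermi estimate: how many piano tuners work in Chicago?
# Every input below is a rough, made-up round number.

population = 3_000_000          # people in Chicago (rough)
people_per_household = 2        # average household size
households_with_piano = 1 / 20  # fraction of households owning a piano
tunings_per_piano_per_year = 1  # each piano tuned about once a year
tunings_per_tuner_per_day = 4   # a tuner services roughly 4 pianos a day
working_days_per_year = 250

pianos = population / people_per_household * households_with_piano
tunings_needed = pianos * tunings_per_piano_per_year
tunings_per_tuner = tunings_per_tuner_per_day * working_days_per_year

print(round(tunings_needed / tunings_per_tuner))  # prints 75: "on the order of ~100 tuners"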
So yeah, those are my rationality exercises, and I really wish I practiced them more regularly. It’s not exactly high-level SEAL-inspired training, and it’s pretty hard to verify, but...it feels like it makes me more rational.
The Navy SEALs not only spend a lot of training effort, funding, and selection on training individuals; they also spend a good portion on research into how to train.
One aspect of researching how to train an ability is having a way to measure progress. I think the Navy SEALs put their trainees through tests at the end of training to evaluate whether to grant them proper Navy SEAL status, so the training can likely be focused on improving against clear metrics.
If we had clear metrics by which to measure progress in rationality training, we could put effort into maximizing them.
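For illustration only (no specific metric is named above): one candidate metric would be calibration on forecasts, scored with something like a Brier score. A minimal sketch:

```python
# Brier score: mean squared error between stated probabilities and outcomes.
# Lower is better; a perfectly informed, perfectly calibrated forecaster scores 0,
# and always answering 0.5 scores 0.25.

def brier_score(forecasts: list[float], outcomes: list[int]) -> float:
    """forecasts are probabilities in [0, 1]; outcomes are 0 or 1."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Example: four predictions a trainee made, and what actually happened.
print(brier_score([0.9, 0.7, 0.2, 0.6], [1, 1, 0, 0]))  # prints 0.125
```

The appeal is that something like this is cheap to administer and shows improvement over time; the obvious limitation is that it captures only one narrow slice of what we'd want "rationality" to mean.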
This may be a result of selection—the military is a couple of orders of magnitude bigger than the rationalist community, and you were hearing from the best of the best that they have.
True, but the mechanisms that cause people to want to join the military (and elite military units in particular) are in my view in scope for this discussion. What would it look like for the rationalist community to be a thing that many intelligent, highly motivated people aspire to join?
My impression is that SEALs are exceptional as a team, much less individually. Their main individual skill is extreme team-mindedness.
This post inspired https://www.lesswrong.com/posts/RdCb8EGEEdWbwvqcp/why-not-more-small-intense-research-teams
Maybe if we can identify an enemy who’s going to shoot at us, we can select and instill that level of commitment. I suspect it comes from a pre-rational part of human motivation, and is not available to the vast majority of rationalists.
After the training begins, something like 80% of the recruits drop out during Hell Week. SEALs are selected for their motivation, which is not available to everyone headed for a warzone.
On the other hand, if you’d really like an existential threat to get you going, you may consider looking into the problem of goal alignment in AGI, or aging.