On Being Robust
Inspired in part by Being a Robust Agent. Flipside of Half-assing it with everything you’ve got.
Do you ever feel… fake? Like, at any minute, Scooby Doo and the gang might roll up and unmask you as a freeloading fraud impostor in front of everyone?
There’s a lot to say about impostor syndrome on a psychological level (the fears are often unrealistic or unmerited, and so on). But I’d like to take another angle. For a few years, I’ve tried to just make a habit of being un-unmaskable. Although this is a useful frame for me, your mileage may vary.
My point isn’t going to just be “do the things you know you should”. I think we’re often bad at judging when corners are okay to cut, so you probably do better just by having the policy of not cutting corners, unless it’s extremely obviously alright to do so. That is, generally err against using scissors when confronted with corners, even if it makes sense in the moment.
Concrete examples
Making insights truly a part of you. This doesn’t mean you should freak out about the Math Gestapo checking whether you’ve memorized what Jordan normal form is. Rather… when I was just beginning to learn formal proof-based math, I worried, “I’m about to go work with some of the smartest people in the world, and they’ll instantly see I’m a fake who just picked up shallow knowledge.” The internal response was “just get good enough that in no conceivable world could you be a fake who secretly can’t do formal math.”
Working out regularly, taking care of the small things, building the key good habits. Having your shit together.
Learning a lot of related areas, just in case they have key insights.
Regularly and automatically backing up your files, in multiple locations (a minimal sketch of what this can look like follows this list).
Using a password manager to generate and store strong passwords, automatically syncing your database over Dropbox, etc.
Rather than keeping embarrassing things on Facebook and hoping people won’t find them, using a tool to search-and-delete incriminating, cringey material from your past.
Keeping your exhaustive resume up-to-date, using a slick template like you know you should.
Following best practices (e.g. when writing code, so there isn’t a secret layer of gross code underneath the most prominent functions; when dealing with git repos, so future collaboration / merging works out okay).
Responding to emails after reading them. Not leaving people on read by mistake (I’m bad at this, actually).
Using spellcheck on your documents.[1]
Scheduling meetings and showing up on time by leaving a lot earlier. Avoiding the planning fallacy. Setting multiple alarms before flights.
Having enough slack.
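For the backup habit above, here’s a minimal sketch of what “regularly and automatically” can look like in practice. It’s illustrative only, not the specific setup recommended in the post: the source and destination paths are placeholders, and a dedicated backup tool (or plain rsync on a schedule) would do the same job with less fuss.

```python
# A minimal sketch of the "regular, automatic, multi-location backup" habit.
# The paths below are placeholders -- point them at your own folders, and run
# this on a schedule (cron, Task Scheduler, etc.) so it never relies on memory.
import shutil
from datetime import datetime
from pathlib import Path

SOURCE = Path.home() / "documents"               # what to back up (placeholder)
DESTINATIONS = [                                 # multiple locations (placeholders)
    Path.home() / "Dropbox" / "backups",         # cloud-synced folder
    Path("/mnt/external-drive/backups"),         # external drive
]


def back_up(source, destinations):
    """Copy `source` into each destination under a timestamped folder name."""
    stamp = datetime.now().strftime("%Y-%m-%d_%H%M%S")
    for dest in destinations:
        target = dest / f"{source.name}_{stamp}"
        try:
            shutil.copytree(source, target)
            print(f"Backed up {source} -> {target}")
        except OSError as err:                   # e.g. drive not mounted; keep going
            print(f"Skipped {dest}: {err}")


if __name__ == "__main__":
    back_up(SOURCE, DESTINATIONS)
```

The point of scripting (or configuring a tool) rather than remembering is the same as the rest of the list: the policy keeps working on exactly the days you’d otherwise cut the corner.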
The general philosophy
This robustness is a kind of epistemic humility—it’s the kind of reasoning that robustly avoids the planning fallacy, only generalized. It’s the kind of reasoning that double-checks answers before turning in the test. It’s best practices, but for your own life.
I try to live my mental life such that, if people could read my thoughts, they would think I’m doing things right. That doesn’t mean I’m always being polite to people in my mind, but it means that I’m not being deceitful, or unfair, or secretly cutting corners on work I’m doing for them.[2]
Again, the point isn’t “have good habits and be happy”. The point is that I think we often cut too many corners, and so I recommend a policy which leans towards not cutting corners (even when it locally makes sense). The benefits for me have been twofold: getting better results, and feeling more secure about myself while getting those results.
Hmm, this all roughly makes sense, but I feel like there was some kind of important generator here that you were aiming to convey that I didn’t get.
I think you should probably do most of these things, though I’m not sure which order to do them in. Meanwhile, so long as you’re afraid of being unmasked, part of the problem seems like it’s about the fear itself?
I think the important generator is: being robust seems like a solution to this “generalized planning fallacy”[1], where you don’t correctly anticipate which corners should not be cut. So, even though you could theoretically excise some wasted motions by cutting pointless corners, you can’t tell which corners are pointless. Therefore, a better policy is just not cutting corners by default.
To be clear, the main point isn’t that people should do these specific things per se; the main thing is the overall mindset.
This is what I was getting at with the caveat at the top of the post (that the fears are often unrealistic or unmerited): I think the fear itself is the key problem with impostor syndrome, and I wasn’t trying to say “just be so good you feel secure” should be the main line of attack on that insecurity.
I don’t particularly like this name, but it’s just a temporary handle. In the planning fallacy, you only stress-test your planned schedule against optimistic scenarios. In this case, you do the same with your life in a broader sense; being robust attempts to counter that.
Ah, that does make the point much clearer, thanks!
Awesome. I should also note this generator is post hoc; I was (trying to) do this for a few years before I was even thinking about the planning fallacy.
This reflects both a couple of comments I’ve made regarding rules versus analyzing/optimizing and a very unclear thought I’ve had bouncing around in my head for a little while now. The thought is about the tendency of discussion here to be very formal and model-oriented, as if we really can optimize in our daily lives the way we do in the theoretical world of our minds and equations and probabilities. I’m not saying that approach is something to avoid, only that it does tend to set the focus on a level of precision that probably does not obtain for most people in a majority of situations. (The recent post about acting on intuitions rather than the calculations, then tracking those over a year or two to see what you learn, fits here too.)
This rule approach clearly takes that case-by-case decision analysis away, if one buys into following the rule. However, we all know that in some cases you can get away with violating the rule (and perhaps even should violate it, or revise it, as has been suggested in other threads/comments as I recall). At the same time, it can be difficult, as you mention, to know just which cases are good for violating the general rule. I would add that it might not be enough to keep track of when we violate the rule and what the results were; the problem probably hangs on just how representative the individual cases are and how well we can tell the differences in any truly complex case.
This all seems very paradoxical to me, and generally I can deal with paradox and a contradictory world (or perhaps just with my own behavior when I try viewing it from an outside perspective). Still, I find myself wanting to wrap my own thinking around the situation a bit better as I read the various approaches and thoughts people share here.
This seems like another angle on “Play in Hard Mode”. Is that about right?
Yeah, I think that’s quite close to this concept—thanks for the link.