You must not fool yourself, and you are the easiest person to fool
The title of this post comes from Richard Feynman’s famous talk about cargo cult science. The principle is very broadly applicable, though. Applying the steps I’ve described in this sequence can feel very satisfying, and even give you a sense that you’ve unlocked something important which puts you in a superior position to others. But it’s crucially important, when doing so, to account for self-deception: your mind is constantly trying to fool you into believing whatever it thinks it’s in your interests to believe. In particular, lying to ourselves about how prosocial our motives are makes it easier to portray ourselves positively to others.
This core mechanism is very well-established, but it’s still an open question how prevalent self-deception is throughout society. My opinion is that it’s very widespread, and still dramatically underrated as a lens for explaining aspects of human minds and societies. Simler and Hanson summarize a wide range of examples in The Elephant in the Brain, drawn from domains such as art, charity, education, politics, and religion. But the area where self-deception seems most prominent to me is self-improvement. I expect that almost nothing in this sequence is really new to most people, and that at some level almost everyone already understands the main ways in which they’re being self-destructive, and what it would take for them to stop. When you make a commitment, you know if you’re probably not going to keep it; when you get angry, you know if you’re trying to find excuses to lash out; when you get drunk, you know if you’re taking it as an opportunity to do things you can’t justify when sober. We just don’t want to admit any of this to ourselves.
From this perspective, what’s missing is a type of courage—but one much harder than the courage required to face an enemy in front of you. This is the courage of deciding to face up to the harmful consequences of your actions even when you have the chance to ignore them. And it’s even more than that: it’s the courage of admitting something that’ll make the people who are most important to you lose respect for you—because many parts of you believe that self-deception is what props up your respect for yourself. There’s a very powerful truth in the classic quote from Marianne Williamson: “Our deepest fear is not that we are inadequate. Our deepest fear is that we are powerful beyond measure.” If you actually had the power to get what you wanted all along, then it would be your fault if you didn’t get it in the past, and if you don’t in the future. Failure is scary, but being responsible for failure is terrifying—so in the short term it feels much better to suppress knowledge of the ways in which you do have power, even if in the long term that prevents you from realizing your dreams.
The same is true of other people, of course. So when trying to cultivate trust, you need to gain trust not only in other people’s conscious intentions, but also in their unconscious intentions. Whenever there’s wiggle room or ambiguity, their unconscious intentions will steer towards the outcome that the most emotionally assertive parts of them want. (For an exquisite portrayal of this, check out the TV show Fleabag—particularly season 2 episode 4, which can be watched standalone.)
Of course, when I say that “at some level almost everyone already understands the main ways in which they’re being self-destructive”, the phrase “at some level” is doing a lot of work. But I don’t intend this to be a wishy-washy claim about your deepest subconscious. So I’ll make a more specific claim: if you could listen in on your thoughts from the outside, as if they belonged to somebody else, it would be clear within a matter of hours or days how your mind is flinching away from fears—whether by rationalizing them away, or hiding them underneath other emotions. There’s no silver bullet for tackling either of these, but they’re often associated with strong emotions which recur in a wide range of situations. Throughout this sequence I’ve described various ways to understand such emotions; the self-deception frame can be seen as one more tool for doing so, by asking: if a part of me were using these emotions to try to hide something, what would it be?
To finish this post, I want to reflect on how it relates to the material in my previous post about agency. In his blog post Keep Your Identity Small, Paul Graham argues that “people can never have a fruitful argument about something that’s part of their identity”, and that “the best plan is to let as few things into your identity as possible”. I partly agree with this: strong identities are a crucial driver of self-deception. For example, my conception of myself as highly growth-oriented has in the past led me to suppress or ignore discomfort about pushing my limits, which later sparked internal pushback against this type of self-coercion. These dynamics are often particularly strong when the identity involved is a collective identity of some kind (e.g. belonging to an ideology or political faction).
But I think Paul goes too far: by keeping your identity too small, you lose out on the benefits of self-reinforcing loops towards having high agency and other desirable traits (which can also be induced even more strongly by collective identities). So how can we balance these benefits and risks? My recommendation is to schedule regular check-ups to question whether the identities you hold strongly are still useful for you, and whether there are any ways in which they might be hurting you or holding you back. The key is to find a sweet spot between two failure modes: questioning the identity constantly would undermine its benefits, while questioning it very rarely might mean you spend a long time blindly going down a dead-end path. (The same approach can be useful for relationships: go all-in for one month, then re-evaluate, then three months, then re-evaluate, then six months, etc.) If you don’t think you’ll actually be able to re-evaluate a part of your identity reliably, you can commit to doing it with trusted friends, or start by holding the identity weakly and cautiously. Ultimately, though, there’s no fail-safe way to proceed: we’ll have narratives about ourselves either way, and we just need to do our best to make sure they work for us and not the other way around.
I’m not quite sure whether this resonates with me personally. It’s all worded fairly strongly/confidently/universally, which probably helps it be motivational for a subset of people at the expense of being epistemically robust.
I had a thought on this line:
I think this is a totally true sentence, but I’d come at it from a frame that’s not rooted in “figure out courage or self-deception in particular” – I think when I’m having a strong, persistent emotion, that’s a good sign that “something is up” that I should investigate, but I’d do so with a pretty open mind about what I might find.
See tuning your emotional processing, and the notion of “keeping emotional-inbox-zero” so problems don’t accumulate and build on each other.
Ty for the feedback, seems fair + useful. I’ve just gone through and added more epistemic signposting + softened the language at various points. Was this the main reason you kept it a personal blogpost?
Also agree. Have now edited the relevant paragraph to read:
To give an example of a popular self-help book which advocates for intentionally adopting identities, here are some quotes from Atomic Habits by James Clear:
Is there a way to unify Paul Graham’s “Keep Your Identity Small” with James Clear’s recommendation in Atomic Habits to adopt identities in order to build habits?
Paul Graham mostly talks about the inability to have fruitful discussions about things people have strong feelings about. James Clear, in contrast, is interested in individual self-improvement, i.e. in changing and sustaining behaviors and habits.
So maybe we could unify the two perspectives by suggesting that adopting an identity indeed makes some thoughts harder to think or discuss (and other thoughts easier to think or discuss), but that when it comes to behavior change, this can be an upside. For example, if you’ve adopted the identity “I’m fit”, then the thought “I don’t want to go to the gym today” becomes harder to think and to act on.
One approach would be to identify as someone who believes whatever is correct, rather than as someone who believes particular positions X, Y, Z. That might have some value. However, it might then become painful to confront the possibility that you’ve been wrong, which could motivate you to interpret new evidence in a way that favors your existing beliefs, or to avoid looking for new evidence.
So probably a better approach is to identify as someone who goes through proper reasoning and epistemic behaviors (always, or at least usually). Someone who is happy to pick up new evidence, and cares more about doing proper reasoning than about reaching a particular conclusion; someone who will cheerfully admit that a belief was wrong, and would find it more embarrassing to admit “I stuck to that belief longer than I should have”.
Marianne Williamson wouldn’t be my top choice for “source of wisdom vis-à-vis not fooling yourself,” but if the shoe fits?