The Truth
The job market for recent PhD graduates is tough. Simply put, there are many more degree holders than available tenure-track positions. People who went through the market describe it as capricious, random, unfair, and difficult. New professors often lament highly qualified colleagues who found no position. If your dissertation is on Witte and no Witte-teaching positions are available, tough luck. This evidence suggests that the market is inefficient and that my personal behavior has a small impact on the outcome.
But on the other hand, there are reasons to believe that a candidate's ability has a large influence. First, the profile of the desired candidate is clear beforehand: a deep drive for knowledge, a record of successful publication, agreeableness toward colleagues, and teaching skill. Second, there is an abundance of clear signals of these attributes: quality of publications, research influence, and size of network are all visible and hard to fake. Third, candidate ability has high variance (the best researchers greatly outperform the rest). Hiring committees therefore have both the tools and the incentives to select high-quality researchers.
The truth likely lies between these two positions. Underperforming PhDs rarely advance, yet many strong CVs fail due to unpredictable changes in demand.
The Instrumental Truth
But perhaps believing the truth is against my interest. If I believe that my work is important to outcomes, I will work harder. If I believe my work has little importance, I may become lazy or seek side-hustles. Should I convince myself that my publication count and thesis quality are more important than is true?
I’d love people’s thoughts on this. Also link to great essays and blog posts about instrumental rationality!
Appendix
The best strategy is to remember that academia is a silly place. If the market is unkind don’t hesitate to go elsewhere. Not-academia has more money and other good stuff! Be careful not to murder-pill yourself into thinking academia is the whole world.
So I think the answer is somewhat complicated and requires unpacking a few things. The big one is that it is possible to commit without belief; many people find this hard to do without expending willpower, and so find it instrumentally useful to lie to themselves about what they believe, only to discover that their beliefs are not very malleable and they can't successfully lie to themselves to achieve this end.
The best situation would be to believe the truth and do it anyway. This requires a level of non-identification with the belief, though, such that you can successfully invest in an uncertain outcome and be happy with the expected returns rather than the actual returns.
If that’s not possible, the next best option is setting up incentives so that you don’t have to change your beliefs: you maintain the beliefs you think are true while still being incentivized to do what you want yourself to do. This is painful for a lot of people because they feel themselves fighting the incentives they set up, but it’s an option.
Epistemically the worst option is to lie to yourself, but it is also probably the least painful if the first option is not available. It will work so long as you can maintain the lie, but you might not like having to do all the work of maintaining it, and it will inevitably poison other beliefs through the network of dependent beliefs that prop up the falsehood you’re maintaining. Not recommended; you’ll create a lot of harm for yourself to unravel later.
I’m limiting my comments to the question at hand, but if you asked for my general opinion, it would be to have alternatives that let you out of the frame of the situation you’ve created, so you don’t have to do this.
Some thoughts:
Regardless of what the truth is, it seems like working hard is the best way forward, rather than getting lazy or seeking side-hustles.
Is it possible for you to willingly choose to believe something if you don’t actually think it’s true? To what extent? I don’t think it’s a yes or no question.
Related to that, are there any downsides, like anxiety? I.e., if you go down the path of believing something that isn’t true (or stretches the truth), will you have a nagging feeling in the back of your mind that produces anxiety?
What are the downstream impacts of deviating from truth? Imagine the following scenario: reality is that your work doesn’t matter much; you choose to believe that it is for instrumental reasons; down the line there is a ton of evidence that your work matters very little but since you took the blue pill you aren’t able to process this evidence and update your beliefs. It seems to me that in order to justify deviating from truth, you have to be pretty confident that it’s the right choice because you give up some future flexibility.
I’m glad you’re asking this. I think this idea of instrumental truth often gets dismissed too quickly, and it’s something that is worth talking about.
This isn’t obvious to me. If you are in, say, the bottom 90% and have no chance of getting an academic job no matter how hard you work, knowing as soon as possible and making exit plans seems like the best thing.
Hm, I was imagining it as a given that the chance of getting a job in academia is high enough for it to be worth continuing, and that from there the question is whether marginal effort on work is worth it. In that case where it’s a given that you’re continuing, it seems worth working hard on your research regardless of whether it’ll help much with job prospects.
To make this more concrete, imagine that we assume there’s a 70% chance of getting a job if you continue and do a normal amount of work. From there there’s the question of whether working extra hard bumps you up to 90 or 95%, or whether it only has a small impact like getting you to 75%. I see that as a separate (but related) question from the question of whether the baseline probability is high enough to continue with the degree.
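To make this even more concrete, here is a minimal expected-value sketch of that comparison. All numbers (payoffs, effort costs, the probability bumps) are hypothetical illustrations, not estimates of the real market:

```python
# Toy expected-value comparison of "normal" vs "extra" effort.
# Every number below is a hypothetical illustration.

def expected_value(p_job, payoff_job, payoff_no_job, effort_cost):
    """Expected payoff of a strategy, net of its effort cost."""
    return p_job * payoff_job + (1 - p_job) * payoff_no_job - effort_cost

payoff_job, payoff_no_job = 100, 20  # arbitrary utility units

# Baseline: 70% chance with a normal amount of work.
normal = expected_value(0.70, payoff_job, payoff_no_job, effort_cost=0)

# Scenario A: extra effort (cost 10) bumps you from 70% to 90%.
extra_a = expected_value(0.90, payoff_job, payoff_no_job, effort_cost=10)

# Scenario B: the same extra effort only bumps you from 70% to 75%.
extra_b = expected_value(0.75, payoff_job, payoff_no_job, effort_cost=10)

print(extra_a - normal)  # positive: extra effort pays off in scenario A
print(extra_b - normal)  # negative: extra effort is a net loss in scenario B
```

The point of the sketch is just that the continuation decision depends on the baseline (70% here), while the work-harder decision depends on the size of the bump relative to the effort cost, which is why they are separate questions.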
It seems unlikely that a person who holds confused beliefs about the likelihood of getting a professorship is capable of making that judgement.
I agree the post strongly frames things as for potential winners who aren’t considering other options, but I think that’s a mistake for this exact reason. Even if a set that benefits from self-deceit exists, you can’t know if you’re in it while you’re self-deceiving.
What if the self-deceit is limited to a given area? Eg. in this case, if the student is honest about what the baseline likelihood of getting a job is (eg. 70%), makes the decision to continue with the program based on that, and then introduces self-deceit specifically for the question of whether working extra hard will do a lot to improve his chances. Isn’t it possible to self-deceive on the latter question without it spilling over into the former question?
I can’t say it’s impossible, but you would have to constantly go back and forth whenever you got new information, while hiding from yourself that you were doing so, and at a certain point it becomes psychologically easier to work hard despite a lack of guaranteed success.
I’m not sure if I’m interpreting what you mean by going back and forth correctly, but I think this will clarify.
If you’re going to self-deceive on eg. the former question of what the baseline likelihood of getting a job is, there is a large downside that you won’t be able to update (well) on new information. Eg. in the scenario where new information arises showing that you should actually quit the grad program, you wouldn’t be able to update on it and would end up making the wrong choice. So it’d only be appropriate to self-deceive in limited situations. Eg. if you’re highly, highly confident that the baseline likelihood of getting a job is large enough that you feel comfortable moving forward with the decision to continue with the program.
As to the question of how psychologically easy it is to self-deceive and successfully compartmentalize new information, I agree that it seems difficult, but I’m not sure. Perhaps it depends on your personality.
In my experience having different parts of oneself believe different things is not good for motivation.
I agree in general, but I can imagine exceptions to that rule.
If you think the baseline chance of getting a job in your field* is 70%, you’re either extraordinarily talented or fairly delusional. I don’t know you, but I know how I’d bet. Long story short: the market is absurdly competitive and you probably shouldn’t get a PhD at all. If you do try for it, you should realize that excessive work doesn’t guarantee success, but failing to work excessively basically guarantees failure.
P.S. I have an Ivy League STEM PhD and am working as an accountant in the public sector (in other words, the higher end of working class or lower middle class). I’m not the only STEM PhD in my bureau.
*excluding adjunct positions paying sub-minimum wage.
My point wasn’t really about making a statement about what the baseline likelihood is, just that baseline likelihood vs benefit of marginal work are two separate questions, and more generally that the question of self-deceit is one that applies to many different sub-problems (self-deceive here but not there).
But yeah, I don’t doubt that you’re right about what the true baseline is. And in general the market sounds quite inefficient. I myself am going through that now as a programmer with six years of experience struggling to find a job.
Fair enough. I’ve never understood how “self-deceit” was supposed to work, though. Self-delusion is simple enough: you believe something that isn’t true. That’s probably universal. But self-deceit seems to require you to believe something that you don’t believe, and I don’t understand why you expect yourself to fall for it.
I certainly believe it’s possible. I have lots of objective measures of progress and ability that I can compare to produce an outside estimate. The post doesn’t discuss this because I’ve already built mechanisms to prevent self-deception on the former question.
That’s true. I believe in solving this by writing clear conditions for withdrawing from the cult beforehand. Pull the parachute rope if:
No publications of note by fourth year
Can’t find editing-commenting exchanges in third year
Have not picked a topic by end of third year
Have not finished thesis by sixth year
I also picked a university in a high-employment city in my field to avoid being murder-pilled by the academic cult. I didn’t include these adaptations in the post to keep the focus on dark-side rationality.
I’m sorry if I’m missing a point, but here are my two cents on the subject:
“Should I convince myself that my publication count and thesis quality are more important than is true?”
I think it’s hard to reduce the problem to this one question: maybe the path to “success” is to work differently, not harder, so it may be dangerous to fixate on the single idea of “working more”. Various skills are necessary, and focusing too much on one thing may be harmful.
(Also, I’m only pointing this out because my overall opinion has already been expressed (: )
I can tell you this is a thing in the PUA community. Believing that every girl wants you is incorrect, but more helpful than the (also probably incorrect) belief that no girl wants you.
That makes me think of sports as a similar example. Eg. believing your defender can’t stop you in basketball.
Maybe. Believing that no girl wants you means you won’t try with girls, and it becomes a self-fulfilling prophecy. Believing every girl wants you means you will try, and even though you won’t bat a thousand, your chances of success are literally infinitely higher.
I’m not a basketball expert at all, but I don’t think you can choose to avoid the defender so that seems like a different thing than what I was getting at.
I think that in both situations you want to have some sort of honest filter about which girl/defender to go after, but once you do make that decision, it is beneficial to think “I’ve got this”.
This post is about not having honest filters. Sounds like you disagree with that theory.
Which is fine, just explains where we are missing each other.
I see it differently. I think there are various places where you can ask the question “Do I want to apply an honest filter here?”. (Or rather, to what extent do I want to stretch the truth?)
The real solution is to drop out unless you are clearly a superstar or extremely close to finishing your PhD. No need to deny the truth.
I say this as someone who wasted three years in a PhD program. Luckily, I got out when I did, before I wasted more years or started a postdoc. If you are not a superstar, start planning your exit.
If you have goals that justify working hard, like the satisfaction of a job well done or hopes for a professor position, then do that.
If your goals justify not working hard, like enjoying your doctoral stipend while gliding through and then moving to another profession, then do that.
If the only reason you are focusing on your thesis is to get a professorship, it sounds like you are not into doing the research for the right reasons.
How about choosing research topics that seem important enough that, even if you don’t get a professorship, you could say they were worth investing yourself in?
The post does not mention choosing research topics strategically, just the number and quality of contributions. I wouldn’t read too much into it.