it’s often stated that believing you’ll succeed actually causes you to be more likely to succeed. there is an immediately obvious deflationary explanation for this: survivorship bias. most people who win the lottery believed that buying lottery tickets was a good idea, but that doesn’t mean we should take their advice. so we should consider the plausible mechanisms of action.
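to see how pure survivorship bias can manufacture this correlation, here’s a toy simulation (the belief rate and win odds are made-up numbers, not estimates of anything real):

```python
import random

random.seed(0)

# toy model: only "believers" buy tickets, and winning is pure chance
# among buyers. belief grants no edge beyond paying the entry fee.
n = 100_000
believed = [random.random() < 0.5 for _ in range(n)]
won = [b and random.random() < 0.001 for b in believed]

winners = [i for i in range(n) if won[i]]
belief_rate_overall = sum(believed) / n
belief_rate_winners = sum(believed[i] for i in winners) / len(winners)

# surveying only the winners: 100% of them "believed in themselves",
# but that tells you nothing about whether buying tickets was a good
# bet in expectation.
print(f"overall belief rate: {belief_rate_overall:.2f}")
print(f"belief rate among winners: {belief_rate_winners:.2f}")
```

conditioning on success makes belief look perfectly predictive of winning, even though the expected value of the bet is unchanged.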
first, it is very common for people with latent ability to underestimate it. in situations where the cost of failure is low, it seems net positive to at least take seriously the hypothesis that you can do more than you think you can (keeping in mind that we often overestimate the cost of failure anyway). there are also deleterious mental health effects of believing in a high probability of failure, and bad mental health does in turn cause failure: it’s really hard to give something your all if you don’t really believe in it.
belief in success also plays an important role in signalling. if you’re trying to make some joint venture happen, you need to make people believe that it will actually succeed (their time has opportunity costs). when assessing the likelihood of success, people take many pieces of information into account: your track record, the opinions of other people with track records, object-level opinions on the proposal itself, and so on.
being confident in your own venture is an important way of putting your “skin in the game” to vouch that it will succeed. specifically, the way this is supposed to work is that you get punished socially for being overconfident, so you have an incentive to only vouch for things that really will work. in practice, in large parts of the modern world overconfidence is penalized less than we’re hardwired to expect. sometimes this is because of regional cultures that accept and even embrace risky bets (silicon valley), and sometimes because the atomization of modern society makes social punishment less biting.
this has both good and bad effects. it’s what enables innovation, because innovation fundamentally requires a lot of people to play the research lottery. if you’re not willing to work on something that will probably fail but pays out big if it succeeds, it’s very hard to innovate. research consists mostly of people who are extremely invested in some research bet, to the point where it’s extremely hard to convince them to pivot if it’s not working out. ditto for startups, which are probably the archetypal example of both innovation and catastrophic overconfidence.
this also creates problems. for instance, it enables grifting: you don’t actually need to be correct if you just claim that your idea will work, and then when it inevitably fails you can say that failure is par for the course. being systematically overconfident also causes suboptimal decisions in domains where calibration actually matters.
because many talented people are underequipped with confidence (there is probably a causal mechanism here: technical excellence often requires a very mechanistic mental model of the thing you’re doing, rather than just yoloing it and hoping it works), this creates a niche for middlemen to supply confidence as a service, aka leadership. in the ideal case, this confidence is supplied by people who are calibratedly confident because of experience, but the market is inefficient enough that even uncalibrated people can supply it. another way to view this is that leaders deliver the important service of providing certainty in the face of an uncertain world.
(I’m using the term middleman in a sense that doesn’t necessarily imply they deliver no value. in fact, causing things to happen can create lots of value, and depending on the specifics this role can be very difficult to fill; but they aren’t the people doing the actual technical work. it is of course also valuable for the leader to, e.g., be able in theory to fill any of the technical roles if needed: it lets them spend their risk budget on the important technical questions, it creates more slack and thereby increases the probability of success, and common knowledge of that slack itself further increases the perceived inevitability of success)
a similar story applies at the suprahuman level of tribes and ideologies. if you are an ideology, your job is unfortunately slightly more complicated. on the one hand, you need to project an air of inevitable success so that people in other tribes feel the need to get in early on yours; on the other hand, you need to make your tribe members feel like every decision they make is highly consequential for whether the tribe succeeds. if you’re merely calibrated, only one of the two can be true at once. religions, nations, political movements, companies, etc. use different social technologies to maintain this paradox.
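the tension can be made concrete with a toy model (all numbers arbitrary): suppose the tribe succeeds if at least one of k independent bets lands, each with probability q. as projected success approaches inevitability, any single member’s marginal impact necessarily shrinks toward zero.

```python
# toy model (numbers arbitrary): success means at least one of k
# independent bets lands, each with probability q.
def p_success(k: int, q: float) -> float:
    return 1 - (1 - q) ** k

q = 0.05
for k in (10, 50, 100):
    total = p_success(k, q)
    # how much the k-th member moves the overall odds
    marginal = total - p_success(k - 1, q)
    print(f"k={k}: p(success)={total:.3f}, marginal impact={marginal:.4f}")
```

with q = 0.05, a hundred members push p(success) above 0.99, but the hundredth member moves it by well under a tenth of a percent. so “our success is inevitable” and “your contribution is pivotal” can’t both be calibrated statements at the same time.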