On honest businesses, I’d expect successful ones to involve overconfidence on average because of winner’s curse. But I’d also expect people who found successful businesses to correctly reject most ideas they consider, and overconfidence to cause them to select the wrong business plan. It’s possible that this is true but that founders switch to overconfidence once they’re committed, perhaps as part of a red queen’s race where founders who don’t exaggerate their business’s prospects fail to attract employees, investors, customers, etc.
As far as the motivation of the founders themselves, though, if success requires overconfidence, it seems to me like either that implies that these businesses actually weren’t worth it in expectation for their founders (in which case founders are harmed, not helped, by overconfidence), or there’s some failure of truthseeking where people with nominally calibrated beliefs irrationally avoid positive-EV ventures.
Eliezer seems like something of a counterexample here: if I recall correctly, his prior estimate that HPMoR would do what it did (recollected after the fact) was something like 10%; he just tries lots of things like that when they’re EV-positive, and some of them work.
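As a rough sketch of that strategy (the 10% figure is from the recollection above; the project count is an arbitrary assumption for illustration), independent low-probability bets add up quickly:

```python
# Assumed numbers: a 10% per-project success chance (per the recollection
# above) and 10 independent attempts (an arbitrary illustrative count).
p_success = 0.10
n_projects = 10

# Probability that at least one of the independent attempts succeeds.
p_at_least_one = 1 - (1 - p_success) ** n_projects
print(f"{p_at_least_one:.2f}")  # prints 0.65
```

So a portfolio of individually unlikely but EV-positive projects can make some success the expected outcome, with no overconfidence about any single project required.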
Secular Solstice seems basically honest. Speculative bubbles cause harm in part because they tend to be based on distorted narratives about underlying facts, such that participating in the bubble implies actually having the facts wrong (e.g. that some type of business is very profitable). With a process that plays out entirely at a single level of social reality (the holiday’s real iff enough people declare it real), there’s not some underlying thing being distorted.
Something else that may be confusing here is that in our language, predictions and performative statements of intent often sound identical. Strunk and White’s discussion of the difference between “will” and “shall” is a clear example of an attempt to keep the two distinct. Again, there could be incentives pointing towards exaggeration (and towards recasting statements of intent as predictions).
EA and LessWrong seem like examples of things that succeeded at becoming movements despite a lot of skepticism expressed early on. On the other hand, a substantial amount of the promotion for these has been promotion of still-untested vaporware, and I’m not sure I uniformly regret this.
Definitely agree that a good businessperson needs to be epistemically sound enough to pick good plans. I think the idealized honest businessperson is something like Nate Soares, separating their ability to feel conviction from their ability to think honestly. But I think that’s beyond most people (in practice, if not in theory). And I think most business ventures, even good ones you have reason for confidence in, would still have a less than 50% success rate. (I think the “switch to overconfidence once you commit” strategy is probably good for most people.)
(I do think you can select business ventures that would generate value even if the business ends up folding within a few years, and that an ecosystem that pushed more towards that than towards random pie-in-the-sky ventures with VC exploitation thrown in is probably better. But because of the competitive nature of business, it’s hard for something to end up with a greater than 50% success rate.)
[mental flag: I notice that I’ve made a prediction that’s worth me spending another hour or two thinking about, fleshing out the underlying model of and/or researching]
> Eliezer...
Oddly enough, I actually have a memory of Eliezer saying both the thing you just referenced and also the opposite (i.e., that he tries a lot of things and you don’t see the ones that don’t succeed, but also that he had a pretty good idea of what he was doing with HPMOR; while it succeeded better than he expected, he did have a pretty good model of how fanfiction and memespreading worked, and he did expect it to work “at all,” or something).
Your recollection about Eliezer seems right to me.
Also, I guess you’re right on the idealized businessperson vs. nearly every actual one. But it’s worth noting that “X is very rare in general” is already more than enough reason for X to be rare among successes.
[Note: This comment seems pretty pedantic in retrospect. Posting anyway to gauge reception, and because I’d still prefer clarity.]
> On honest businesses, I’d expect successful ones to involve overconfidence on average because of winner’s curse.
I’m having trouble understanding this application of winner’s curse.
Are you saying something like the following?

1. People put in more resources and generally try harder when they estimate a higher chance of success. (Analogous to people bidding more in an auction when they estimate a higher value.)
2. These actions increase the chance of success, so overconfident people are overrepresented among successes.
3. This overrepresentation holds even if the “true chance of success” is the main factor. Founders’ overconfidence just needs to shift the distribution of successes a bit for “successful ones to involve overconfidence on average”.
First, this seems weird to me because I got the impression that you were arguing against overconfidence being useful.
Second, are you implying that successful businesses have on average “overpaid” for their successes in effort/resources? That is central to my understanding of winner’s curse, but maybe not yours.
Sorry if I’m totally missing your point.

You get winner’s curse even if results are totally random. If you have an unbiased estimator with a random error term, and select only the highest estimate in your sample, the expected error is positive (i.e. you probably overestimated it).
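That selection effect can be sketched with a small simulation (all numbers here are made-up illustrative assumptions, not anything from the thread):

```python
import random

random.seed(0)

N_TRIALS = 10_000    # assumed number of simulated "contests"
N_BIDDERS = 20       # assumed number of competitors per contest

winner_errors = []
for _ in range(N_TRIALS):
    # Each competitor has a true value and an unbiased noisy estimate of it.
    true_values = [random.gauss(0.0, 1.0) for _ in range(N_BIDDERS)]
    estimates = [v + random.gauss(0.0, 1.0) for v in true_values]
    # The "winner" is whoever holds the highest estimate.
    i = max(range(N_BIDDERS), key=lambda j: estimates[j])
    winner_errors.append(estimates[i] - true_values[i])

mean_winner_error = sum(winner_errors) / N_TRIALS
# Although every individual estimate is unbiased, conditioning on winning
# selects for positive error: mean_winner_error comes out clearly positive.
```

No competitor here is overconfident in any behavioral sense; the upward bias among winners falls out of selection on noisy estimates alone.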