I disagree with your interpretation of what happened with respect to talent constraints. In addition, I have a meta-critique. In your hypothetical, people talk about ‘talent constraints’ without citing any articles. But you don’t cite any articles either!
I think quite a lot of the basis for ‘EA is talent constrained’ came from the 80K Hours surveys from 2017 and 2018. Both surveys were quite detailed and cannot be quickly summarized, but the 2017 report literally says the following:
On a 0-4 scale EA organisations viewed themselves as 2.5 ‘talent constrained’ and 1.2 ‘funding constrained’, suggesting hiring remains the more significant limiting factor, though funding still does limit some.
‘EA is talent constrained’ was definitely not just a mistranslation; the explicit concept of ‘talent constrained’ vs. ‘funding constrained’ was used on official surveys as recently as 2017. The surveys also make it clear that organizations considered keeping their recent hires to be worth seemingly incredible amounts of money. They were asked:
For a typical recent Senior/Junior hire, how much financial compensation would you need to receive today, to make you indifferent about that person having to stop working for you or anyone for the next 3 years?
Responses in 2017 and 2018 respectively for Senior/Junior hires:
80k Hours explicitly agrees that the surveys were misleading for many readers and that they will try to do better in the future, so there is no need to harp on them. But the meta point seems important here.
Oh, whoops! I totally meant the post to directly link to this post:
https://80000hours.org/2015/11/why-you-should-focus-more-on-talent-gaps-not-funding-gaps/
Which I have now properly edited in. Not sure that changes the thrust of your critique, but that was a straightforward error on my part.
I probably agree that (or at least, would not be surprised if) 80k or CEA also communicated poorly in contexts other than the initial post. (I also think there’s still plenty of room for the initial post to have been clearer.)
FWIW I also think your summary of the 2015 article is inaccurate. For example, “EA needs very specific talents that are missing” isn’t consistent with the section titled “Less Earning to Give”, which states very clearly that more than 20% of EAs, total, should be doing direct work. “EA needs lots of generally talented people” is a much better fit. My own experiences are consistent with that: the people I know who got career advice from 80k or other EA thought leaders in that era were all told to do direct work, typically operations at EA orgs.
Normally this wouldn’t be worth talking about; who really cares whether an article from 2015 was unclear, or clearly communicated something its authors now disagree with? Here I think the distinction matters, because it’s a load-bearing part of the argument that mentorship is a bottleneck for EA specifically. People who got top-tier mentorship in 2015 were told things we now agree aren’t true, but that were consistent with the articles available at the time. People who got top-tier mentorship in 2020 got different advice (I assume, I haven’t kept up since covid started), but how much better was it, in terms of knowledge, than the articles available?
I could definitely buy that EA has a shortage of mysterious old wizards, though.
Here I think the distinction matters, because it’s a load-bearing part of the argument that mentorship is a bottleneck for EA specifically
I intended that more as an illustrative example than the key piece of evidence. (I think I’ve gotten tons of advice that was nuanced and wasn’t well written up in the EA sphere, and depended on someone correcting my misunderstandings)
I think there’s generally a lag time of 2ish years between someone having a clear sense of the advice they give people, and that advice getting written up. For example, my previous post here:
https://forum.effectivealtruism.org/posts/HBKb3Y5mvb69PRHvP/dealing-with-network-constraints-my-model-of-ea-careers
That’s basically the advice I’d have given someone for 1-2 years prior to posting it. Meanwhile at the time I posted it I also had the Mysterious Old Wizard bottleneck formulated in my head, but didn’t publish for another 1.5 years.
Then there’s a post like this:
https://www.lesswrong.com/posts/HnC29723hm6kJT7KP/taking-ai-risk-seriously-thoughts-by-critch
(possibly out of date now, not sure if Critch still endorses it), which I was able to write because Critch and I were in a shared social-context at the time, and I got to overhear him saying things. But he would never have gotten around to writing it on his own. And it still took me maybe 7 months to go from “oh I could have written this up in a blogpost” to “it’s actually written up.” And then the post is still optimized for addressing my particular misconceptions, which actually involved a lot of back-and-forth at the time.
And I think the world is actually changing fairly rapidly, and our best understanding of what-people-should-do changes with it, so being 2 years out of date is pretty bad.