[Edit: original source has been found. The way I framed this question was off, since I’d mis-remembered it as “amount that a given grantmaker grants per year” rather than “grants per hour of staff time”. Updated the title]
I recall reading an article once that claimed that, across many small and large foundations, there was a maximum amount that a given grantmaker typically gave out. As organizations scaled up to give out more money, this amount stayed surprisingly fixed, implying a higher overhead ratio than you might have expected.
(i.e. when an org gives out $1 million a year, it has N grantmakers, and when it gives out $100 million a year, it typically has 100N grantmakers).
I don’t remember the number, or the methodology that determined it. Curious if anyone can remember the article. (It might have been from OpenPhil’s blog, or it might have been some random news site.)
I vaguely remember the number “3” being involved, possibly $300k, or $3 million.
The takeaway I remember was something like “you might naively think you can scale up an organization and then give away money more efficiently, but weird forces seem to limit that.”
Does this sound familiar to anyone?
For posterity (esp. because I don’t trust Facebook as a repository of knowledge), quoting in full the original FB post that riceissa linked:
You might have in mind a Facebook post that Vipul Naik wrote in 2017.
Ah, yes. That was it.