You have a stronger case there, since they’re less unusual than MIRI, but I think my point still applies: most people expert in hiring top mathematicians or computerfolks aren’t used to hiring for charitable work, and most people expert in hiring for charities aren’t used to looking for top mathematicians or computerfolks.
(Also: it’s not clear to me how good we should expect the people who are best at hiring to be, even if their hiring skills are well matched to the particular problem at hand. Consider MIRI specifically for a moment. Suppose it’s true either 1. that MIRI’s arguments for why their work is important and valuable are actually unsound, and the super-bright people they’re trying to hire can see this, although they could be bamboozled into changing their minds by an hour-long conversation with Eliezer Yudkowsky; or 2. that MIRI’s arguments look unsound in a way that’s highly convincing to those super-bright people, although they could be made to see the error of their ways by an hour-long conversation with Eliezer Yudkowsky. Then there’s not much a hiring expert can actually do to help MIRI hire the people it’s looking for, because the same qualities that make them the people MIRI wants also make them not want to work for MIRI, and what it takes to change their mind is skills and information someone who specializes in being good at hiring won’t have. Of course this is a highly contrived example, but less extreme versions of it seem somewhat plausible.)
Object level: I suspect EA wouldn’t be hiring technical people directly. More likely it would be finding PIs who would hire teams to do certain things. There are many good PIs who don’t mesh well with academia, since academia selects for traits uncorrelated, and in some cases anti-correlated, with good science. Meta level: I don’t think we need to settle these questions ourselves, because this is exactly the sort of consideration I want to hire for — people with experience evaluating such questions and executing on them.
Yeah, that’s fair; I didn’t adjust my thinking sufficiently when I realized you were referring to EA in general rather than to MIRI-alikes.