My guess is that Eli is referring to the fact that the EA community largely donates wherever GiveWell recommends, and that a lot of the discourse centers on trying to figure out all the effects of a particular intervention, weigh it against all other options, and then come up with a plan of what to do. Such a plan is incredibly sensitive to your being right about the prioritization, the facts of the situation, etc., in a way that will cause you to predictably do worse than you could, due to factors like a lack of on-the-ground feedback pointing to other important areas, misunderstanding people's values, errors in reasoning, and a lack of diversity in the attempts, so that if one part fails nothing gets accomplished.
I tend to think that global health is relatively uncontroversial as a broad goal (nobody wants malaria! like, actually nobody), so it doesn't suffer from the "we're figuring out what other people value" problem as much as other things do. But I also think it's almost certainly not the most important thing for people to be working on now to the exclusion of all else, and lots of people in the EA community seem to hold similar views.
I also think that GiveWell is much better at handling that type of issue than people in the EA community are, but that the community (at least the Facebook group) is somewhat slow to catch up.