GiveWell reanalyzed the data it based its recommendations on, but hasn’t published an after-the-fact retrospective of long-run results. I asked GiveWell about this by email. The response was that such an assessment was not prioritized because GiveWell had found implementation problems in VillageReach’s scale-up work as well as reasons to doubt its original conclusion about the impact of the pilot program.
This seems particularly horrifying. Everyone already knows you're incentivized to play up the effectiveness of the charities you recommend. Deciding not to check back on a charity you've recommended, for the explicit reason that you know you can't show things went well when you predicted they would, is a very bad sign; it should be a reason to do the exact opposite, i.e. to go back and actually publish an after-the-fact retrospective of long-run results. If anyone was looking for more evidence on whether to take GiveWell's recommendations seriously, then, well, here it is.
It seems to me that GiveWell has already acknowledged perfectly well that VillageReach is not a top effective charity. It also seems to me that there are lots of reasons one might take GiveWell's recommendations seriously, and that getting "particularly horrified" about their decision not to research exactly how much impact their wrong choice didn't have is a rather poor way to conduct any sort of inquiry into the accuracy of organizations' decisions.
It was very much not obvious to me that GiveWell doubted its original VillageReach recommendation until I emailed. What published information made this obvious to you?
The main explanation I could find for taking VillageReach off the Top Charities list was that they no longer had room for more funding. At the time I figured this simply meant they’d finished scaling up inside the country and didn’t have more work to do of the kind that earned the Top Charity recommendation.
From http://blog.givewell.org/2012/03/26/villagereach-update/:

"We are also more deeply examining the original evidence of effectiveness for VillageReach's pilot project. Our standards for evidence continue to rise, and our re-examination has raised significant questions that we intend to pursue in the coming months."
I had donated to VillageReach due to GiveWell's endorsement, and I found it moderately easy to notice that they had changed more than just the room-for-funding conclusion.
That update does seem straightforward; thanks for finding it. I see how people following the GiveWell blog at the time would have had a good chance of noticing this. I wish it had been easier to find for people trying to do retrospectives.
In 2012 they said that they considered VillageReach a good bet that didn't pay off. I'm not sure whether they said so online (the thing I'm remembering was at the first Effective Altruism Summit).
pcm's comment found a blog post saying something consistent with that. It seems weird, then, that there's no corresponding asterisk signifying deprecation on their Impact page, or any clear statement of this on the charity page.
I guess, but I feel like you're reading a lot of this uncharitably.
I think being transparent in all the ways you're asking for would take a lot of effort, and is probably not the best use of it.