It seems to me that GiveWell has already acknowledged perfectly well that VillageReach is not a top effective charity. It also seems to me that there are lots of reasons one might take GiveWell's recommendations seriously, and that getting "particularly horrified" about their decision not to research exactly how much impact their wrong choice didn't have is a rather poor way to conduct any sort of inquiry into the accuracy of organizations' decisions.
It was very much not obvious to me that GiveWell doubted its original VillageReach recommendation until I emailed. What published information made this obvious to you?
The main explanation I could find for taking VillageReach off the Top Charities list was that they no longer had room for more funding. At the time I figured this simply meant they’d finished scaling up inside the country and didn’t have more work to do of the kind that earned the Top Charity recommendation.
From http://blog.givewell.org/2012/03/26/villagereach-update/:

"We are also more deeply examining the original evidence of effectiveness for VillageReach's pilot project. Our standards for evidence continue to rise, and our re-examination has raised significant questions that we intend to pursue in the coming months."
I had donated to VillageReach due to GiveWell's endorsement, and I found it moderately easy to notice that they had changed more than just the room-for-funding conclusion.
That update does seem straightforward, thanks for finding it. I see how people following the GiveWell blog at the time would have a good chance of noticing this. I wish it had been easier to find for people trying to do retrospectives.
In 2012 they said that they considered VillageReach a good bet that didn't pay off. I'm not sure whether they said so online (the thing I'm remembering was said at the first Effective Altruism Summit).
pcm's comment found a blog post saying something consistent with that. Seems weird, then, that there's no corresponding asterisk signifying deprecation on their Impact page, or any clear statement of this on the charity page.
I guess, but I feel like you're reading a lot of this anti-charitably.
I think being transparent in all the ways you're asking would take a lot of effort, and probably wouldn't be the best use of it.