If on average aid has not contributed to economic growth, and the best foreign aid charities contribute a lot to economic growth, then many other foreign aid charities must detract a lot from economic growth.
My impression is that the situation is closer to a very large majority having a small negative impact and a very small minority having a large positive impact.
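A toy illustration of how a roughly zero average is consistent with this picture (the numbers are invented purely for the arithmetic, not estimates): if 99% of aid dollars each have a small negative effect of size 0.1 and 1% have a large positive effect of size 10, then the average impact is
$$0.99 \times (-0.1) + 0.01 \times 10 = -0.099 + 0.1 \approx 0.$$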
people cannot tell them apart (if they could, they would definitely shift their contributions).
The reason that people cannot tell them apart is that they’re putting essentially no effort into doing so. According to the recent Money for Good study, only $4.1 billion of the roughly $300 billion in donations mentioned above came from donors who did research comparing multiple charities when deciding where to give. It’s plausible that donors who make an active effort to maximize the positive effects and minimize the negative effects of their donations can do far better than the average donor.
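For scale, simple arithmetic on those figures puts the researched share at roughly
$$\frac{4.1}{300} \approx 1.4\%$$
of total giving, under the study’s definitions.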
What terms of the bet are you proposing? I’d take it if it weren’t for the difficulty of measurement.
I’m not literally proposing a bet; I’m just saying that while it could be that donating to charities like Deworm the World and VillageReach doesn’t have a positive impact on economic growth, I judge the expected value to be moderately positive and I don’t see any reason to think otherwise. This is in line with MassDriver’s comment:
honestly I find it hard to even articulate a hypothesis on which, e.g., de-worming initiatives don’t foster economic growth. I wouldn’t be starting many local businesses if my brain couldn’t get calories out of my gruel because they went to a tapeworm first.
There are plausible explanations for why the net effect of aid has been trivial that don’t preclude the hypothesis that the interventions under discussion are effective.

Also, we know that much of the aid was given in the knowledge that it would cause harm, and was designed to be stolen or abused, because it was being used as bribes for nasty regimes in geopolitical contests. That can account for a sizable chunk of the “negative” effect balancing out the positives.
According to the recent Money for Good study, only $4.1 billion of the roughly $300 billion in donations mentioned above came from donors who did research comparing multiple charities when deciding where to give.
This implies that GiveWell is a much better charitable cause than VillageReach.
In any case, all of my charitable budget goes toward the provision of public goods—this has a clear, large net positive effect, while alleviating suffering would only have a positive effect under some rather strong assumptions about how well informed I am.
I haven’t donated anything to the CPC yet (other than a few throwaway comments about how remarkable their performance has been; I tend to do that for things I like, and it’s hardly much of a “charity”). I consider this a very interesting idea, but I’d like someone else to verify that it makes sense.
Actually, the situation is probably quite a bit worse than the $4.1 billion figure that I cited suggests: “doing research comparing multiple charities” probably entails visiting several charities’ websites and/or referring to charity watchdog organizations, which rate charities on financials rather than impact.
This implies that GiveWell is a much better charitable cause than VillageReach.
If one ignores signaling/incentive effects then I agree.
Up until this point, GiveWell has been focusing on attracting donations for its recommended charities rather than soliciting money for itself. The more money GiveWell moves, the more influence it will have subsequently. Whether or not donating to GiveWell’s recommended charities is genuinely a good way to support GiveWell is unclear to me, but it’s what I’ve done so far on their recommendation.
I think their reasoning has been that they want to prove they’re doing something tangibly useful by directing more money to their recommended charities before fundraising for themselves. Presumably this comes from their emphasis on proven programs.
I personally would like to see them shift toward evaluating charities like Wikipedia, etc. for which it’s more difficult to assess the impact but which have potentially very high expected value.
In any case, all of my charitable budget goes toward the provision of public goods—this has a clear, large net positive effect, while alleviating suffering would only have a positive effect under some rather strong assumptions about how well informed I am.
Sure, makes sense. If you’re interested I’d encourage you to fill out GiveWell’s survey—this could influence what causes they look into next and help you optimize your public goods donations. They’ve been going where the interest is, presumably in an effort to gain broader traction (e.g. they started looking into disaster relief as a cause in response to receiving a number of queries from prospective donors).
I haven’t donated anything to the CPC yet (other than a few throwaway comments about how remarkable their performance has been; I tend to do that for things I like, and it’s hardly much of a “charity”). I consider this a very interesting idea, but I’d like someone else to verify that it makes sense.
Interesting :-). Is the CPC accepting donations?
Maybe better still would be to fund a (hypothetical) advocacy group that offers the CPC money in exchange for greater openness and freedom of speech in China, potentially leading to progress on two fronts at once? (This presupposes that straightforwardly increased civil rights in China would not indirectly reduce its economic growth, an assumption which admittedly may not be valid.)
It is clear to me that the truly efficient charitable cause is rationality itself. GiveWell is giving money to VillageReach as a way of proving to stupid, irrational people that efficient charity is better than random charity. (Duh.)
But if you could find a way to make rationality more widely accepted, even by a tiny amount, then you would incrementally solve the “efficient charity” problem along with a host of others, including existential risk, lack of life-extension advocacy, etc etc.
But if you could find a way to make rationality more widely accepted, even by a tiny amount, then you would incrementally solve the “efficient charity” problem along with a host of others, including existential risk, lack of life-extension advocacy, etc etc.
Agree, but easier said than done :-).

Beware the fallacy of the drunkard who looks for his keys under the streetlight rather than in the alley where he dropped them, because “the light is better here.”
Sure, I’m not saying that one shouldn’t try. Several points here:
My observation has been that there’s a tendency for people with lower innate levels of rationality who are exposed to rationality to adopt “rationality as attire” analogous to Science As Attire. People can nominally become more rational without this having a deep impact on them, and this can give rise to an illusion that raising levels of rationality is easier than it actually is. I have limited data and the relative significance of this factor is unclear to me.
I’d certainly be interested in brainstorming with you about ways to raise the global standard for rationality.
Concerning easily accessible projects vs. difficult, inaccessible ones: I think that, as a heuristic, younger people should aim for smaller successes to develop a credible track record that they can leverage toward more ambitious goals later.
I think that it might help if one could make rationality look more like a way to win and less like a cult(ure) of self-sacrifice and loserdom. This is a big problem I have with LW: it generates Losers, not winners. Yes, generous losers who want to help others, but losers nonetheless. An ideology that makes you into a loser (no matter how generous a loser) is going to sell like warm dog-poo.
Maybe it would be possible to turn a branch of rationality into a machine that outputs people who are “winners” according to a diverse set of already-accepted standards of winningness. I.e., not “how much has this person helped random strangers?” but rather “has this person got an expensive car?”, “has this person got an active social life?”, “has this person got a hot partner?”, “does this person give off signals of high status?”, etc.
I think it’s hard to even imagine rationality as popular because what we have here is so different from what could ever be popular.
Maybe there simply isn’t a way for one to use epistemic rationality to generate winning people. Maybe the only way to reap any reward from rationality is to have a whole society simultaneously adopt it, producing an irrationality/collective-action-problem catch-22 which will be the end of us all.

(Irrationality/collective-action-problem catch-22 = can’t make anyone rational without solving important collective action problems; can’t solve important collective action problems without most people being rational. Hence impasse, stupid, fail, die.)
An ideology that makes you into a loser (no matter how generous a loser) is going to sell like warm dog-poo.
Huh. This statement just seems wrong on its face. For example, Christianity is a fairly popular ideology, and it at least seems to “make you into a generous loser” in the sense you mean that here.
Possibly I don’t understand Christianity properly… or maybe I don’t understand what you mean here.
Or maybe there was an implicit “will sell like warm dog-poo [within the community we’re talking about]” and I’ve lost track of context.
I mean, sure, I agree that if you’re primarily concerned with people who primarily want expensive cars, showing how rationality leads to having expensive cars is definitely the way to go. (Ditto for social life, hot partners, status and so forth.)
It is and it isn’t.

The intersection of ideology and identity is all about defining winners to include you. Most ideologies have someone to look down on for that exact reason: we’re winners because we’re not X. I recently started listening to country music quite a bit, and it is somewhat amazing how many songs profess a preference for being poorer/simpler/etc., but it makes perfect sense when you imagine it as them redefining success to exclude people who own mansions but don’t have time to go to the fishin’ hole. (Side note: the rich people I know who like to fish regularly go fishing.)
And so it seems to me that LW’s brand of “not only should you be an altruist, but you should be a particular kind of altruist that doesn’t get warm fuzzies” will sell like warm dog-poo, because that’s only barely about rationality. Even standard rationality, the “I’m often wrong but I try to be less wrong” kind, only sells to the analog of theologians among the religious. Christianity works both for the people who want the social club (and to look down on the unsaved) and for the people who want personal growth (and to look down at those who don’t get it). But generally speaking the first group is larger than the second group, and we only appeal to the second group.
Do people move from one group to the other? Yes, of course. (Unfortunately, it goes both ways.) Should we fret about how many people would be attracted to the stuff we do? Honestly, I don’t see why. One could get some validation that other people like it, or some validation that other people don’t like it. But rationality is fundamentally an individual thing and it should provide individual benefits. Turning it into a political or social movement introduces all the problems inherent in political or social movements, and it seems better to just live so well that other people ask you what you’re doing.
As far as I can tell, “rationalism” as a social movement actually does pretty well on the “people who want the social club and to look down on the unsaved” front among people who identify as smart (where the “unsaved” equivalent is “people not as smart as us”), and not so well among those who don’t.
In any case: yeah, if one doesn’t want to “sell” it in the first place, one’s problems become simpler.
Christianity is a worthy counterexample. But note that in the developed world it is massively in retreat; i.e., on a level playing field where Christianity started today with the same number of initial members as LW has, it would die.
On the other hand, something like Scientology has actually succeeded in growing from a tiny base, so maybe that’s a stronger counterexample. But note that Scientology sells itself a lot on helping people with personal development, i.e. winning. As does Christianity to some extent, especially the brands of Christianity that are actually succeeding in attracting new members.
In conclusion, I think that successful religions actually excel at offering the recruit some short-term gains in winningness. In the case of Christianity at least, I think that the gains are permanent for many people.

Thinking about it, the need for a personal-development-focused ideology is obviously very strong. I mean, geez, if people are prepared to believe utter bullshit in order to get it, then there must be a strong need for it.
(nods slowly) Yeah, you’re right: I can’t think of any “loser-making” ideologies that are growing in popularity compared to prosperity theology.
OK, I stand corrected, at least as applied to the modern world.
One problem, as has been discussed many times, is working out a delivery vehicle for “rationality as a way to get good stuff from the world” that doesn’t have its lunch eaten by “the trappings of rationality as a way to get good stuff from gullible people.”
To solve that problem in the context of personal development, we need short-term gains that swamp the placebo effects that hucksters offer.
Which is a tricky problem, because the placebo effects are actually pretty darned compelling: increasing confidence and subverting people’s self-sabotage techniques really do get a lot of win right off the bat in the areas people normally think of for personal development (getting a raise, a better job, making friends, mood maintenance, weight loss, etc.).
My own instinct would be to start in a market where there isn’t a strong established antirationalist competitor, that isn’t primarily social (thus less readily swamped by the effects of charisma), and that is genuinely difficult (such that a good approach quickly generates noticeably better results than a poor one).
The one that jumps out at me is personal finance. A reliable rational technique for substantially outperforming the market in a 3-6 month timeframe would be a pretty good hook.
What do you mean by personal finance? Do you mean how to make money?

“Personal finance” to me has meant “reliable tweaks to optimise current methods of making money.” So, less about making money and more about decreasing waste of the money you’ve made.

Not “how to get rich quick,” but “how to be a little richer than you are.”
How to make money, how to spend less than you make, how to get the stuff you want for less money, how to make reliable plans for having enough money in the future (e.g., “planning for retirement”).
Yeah, that sounds like a good first target. Though note the link to personal development, which itself links to social development.

Hm. That raises an interesting potential point of differentiation, actually.
I’ve seen a lot of “make money” guides that spin themselves as personal development plans… “change your attitude, use these techniques, and you’ll be powerful and successful and popular” and so forth.
Which is unsurprising; this is how one creates followers.
Taking instead the tactic of “Your attitude doesn’t matter. Do these things, and you’ll get positive results. We’re talking about the reality outside your head, here.” might help inhibit subversion by magical thinking.
I think that it might help if one could make rationality look more like a way to win and less like a cult(ure) of self-sacrifice and loserdom. This is a big problem I have with LW: it generates Losers, not winners. Yes, generous losers who want to help others, but losers nonetheless. An ideology that makes you into a loser (no matter how generous a loser) is going to sell like warm dog-poo.
Spot on! There is an awful lot of social pressure here that has absolutely nothing to do with behaving rationally—in fact, some of it is directly opposed.
My observation has been that there’s a tendency for people with lower innate levels of rationality who are exposed to rationality to adopt “rationality as attire” analogous to Science As Attire. People can nominally become more rational without this having a deep impact on them, and this can give rise to an illusion that raising levels of rationality is easier than it actually is. I have limited data and the relative significance of this factor is unclear to me.

“Rationality as attire” can actually win really well, because it takes so little to do better than most people. There’s a lot of low-hanging fruit.

Guilty! ;)