One main prescription of the article seems to be “encourage signaling of prosocial goods, so that the cost at least goes to somebody.” I think this does not necessarily work, because if something’s value as a signal depends on its cost and not on its prosocial qualities, then the natural pressure will optimize away from real benefits and towards apparent benefits.
What you get, I think, if you aim for “at least make your signaling do-gooder-ish” is a proliferation of fake do-gooding, which has the real harm of giving actual do-gooding a bad name. I’m not sure that this is net harmful, but I think it’s a matter of real doubt. “Conspicuous philanthropy” has, after all, provided the US with countless Carnegie libraries. But it’s a serious question whether poorly-aimed aid might be net harmful for poor countries. For another example, the Crusades were acts of “conspicuous philanthropy”—European nobles did not profit from them, but committed vast sums to foreign wars out of piety. It’s not at all clear that this was good for the world.
I endorse this concern. I do think it is possible to create social value in this way though, especially for relatively simple activities with good alignment between apparent and real benefits, e.g. transferring money / fungible resources to an agent that is trying to do good, or supplying additional tax revenue. So I think there are at least some equilibria where the benefits significantly overwhelm the negative effects, and indeed are a significant fraction of the total loss to the signaler.
I think that reaching a good equilibrium is especially plausible amongst the rationalists/EAs.
If 90% of the price of a diamond ring goes to an efficient charity, then the ring seems to lose 90% of its signaling value for an EA. Suppose an EA is planning to donate $X or Y% of lifetime income to an efficient charity (believing that to be the optimal balance between selfish and altruistic values); then after buying the ring they would reduce their future donations by 90% of the price of the ring, since that would maintain the optimal balance between their values. So the amount of money they “lost” by buying the ring is only 10% of its price, and that would be taken into account by the recipient of the ring and other observers.
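The offset arithmetic above can be sketched as a toy calculation. All the numbers here (ring price, 90% pass-through) are hypothetical illustrations, not figures from any real charity-diamond scheme:

```python
# Sketch of the offset argument: if a fraction of a gift's price is passed
# through to a charity the buyer would have funded anyway, the buyer cuts
# their planned future donations by that amount, so their *net* sacrifice
# is only the non-charitable remainder.

def net_signaling_cost(ring_price: float, charity_fraction: float) -> float:
    """Net cost to a buyer who offsets the charitable portion of the
    purchase by reducing planned future donations by the same amount."""
    donated = ring_price * charity_fraction  # portion passed to the charity
    offset = donated                         # future donations reduced to compensate
    return ring_price - offset               # what the buyer actually "loses"

price = 10_000      # hypothetical ring price in dollars
fraction = 0.9      # hypothetical 90% pass-through to an efficient charity
print(net_signaling_cost(price, fraction))  # 1000.0, i.e. 10% of the price
```

This is just the comment's claim made explicit: the observable cost to the signaler shrinks to the non-charitable fraction, which is why the ring loses most of its signaling value among observers who know the buyer donates anyway.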
Yes, the signaling effect only works for those who don’t much value the social good. If someone is overinvesting in costly signaling then there must be some form of social good they don’t value (namely the welfare of the other people who are engaged in the signaling game). If you align these negative externalities perfectly with the positive externalities you create, then everything works out perfectly (and this is obvious to observers). Otherwise, you are counting on the person not much preferring the beneficiaries of the charity to the losers from the signaling game. That’s often going to fail in the conspicuous philanthropy case, and I’m not sure how to make an alternative that comes closer.
Of course as long as the conspicuous philanthropy is not the absolute most effective philanthropy for the signaler then you could in principle just scale up the signaling costs appropriately. But this introduces lots of extra problems, since by giving you are then mostly signaling that you like charity.
True, but EAs tend not to demand burning money as strongly as society as a whole does, and it won’t lose all its effective signaling power to non-EAs unless there’s a much better flow of information than is typical in society.
For example, a fiancé’s parents might be mortally offended if their child didn’t give and/or receive the usual burnt offering in the form of costly signaling. However, I suspect that in most cases the charity diamond would be acceptable, even if the parents understood EA and knew that the counterfactual amount burnt was only 10% of the standard burnt offering.
After all, what they really care about is their son- or daughter-in-law not embarrassing the family by looking cheap. So, unless all their friends also understand EA, they likely won’t care whether the offering is donated or burnt, so long as there’s a diamond.
This might not work if the 90% were going to something besides charity, but luckily it’s socially costly to criticize good deeds. (Hence why EAs have a hard time arguing against donations to curing rare diseases in cute puppies. You look like a dick if you criticize warm fuzzies.)
Some EAs do have a fixed 10% commitment, but many don’t. The Giving Pledge, for instance, asks for a majority of one’s wealth.