I admit to being perplexed by this and some other pro-altruism posts on LW. If we’re trying to be rationalists, shouldn’t we come out and say: “I don’t often care about others’ suffering, especially that of people I don’t know personally, but I do try to signal that I care because this signaling benefits me. Sometimes this signaling benefits others too, which is nice”?
I agree everyone likely benefits from a society structured to reward altruism. We all might be in need of altruism one day. But there seems to be a disconnect between the prose of articles like this one and what I thought was the general rationalist belief that altruism in extended societies largely exists for signaling reasons.
Also, the benefits of altruism seem significantly less substantial when the targets are animals. Outside of my personal experience, animals are simply unable to return any favors. If I save the lives of some children in Africa, I can hope those people will contribute to the global economy and help make the world a better place for my children. Unfortunately, the same cannot be said of my food.
I realize the article starts with the conditional “if one cares about suffering”, so my comments above aren’t really a critique. A more direct critique would be: “who really cares about suffering?” If we only care about signaling altruism, then I think we should just come out and say that.
I like animals and have owned many pets, but I do not care about the suffering of animals far outside my personal experience. If I were surrounded by people who cared about such things, then I would likely learn to as well; to do otherwise would signal barbarism. I might also learn to care if I were interested in signaling moral superiority over my peers.
That’s, um, not a general rationalist belief.
More accurately, we evolved to be altruistic for signalling reasons. However, we don’t really care why we evolved to be altruistic. We just care about others.
I don’t often care about others’ suffering, especially that of people I don’t know personally, but I do try to signal that I care because this signaling benefits me

Remember the evolutionary-cognitive boundary. “We have evolved behaviors whose function is to signal altruism rather than to genuinely cause altruistic behavior” is not the same thing as “we act kind-of-altruistically because we consciously or unconsciously expect it to signal favorable things about us”.
If you realize that evolution has programmed you to do something for some purpose, then embracing that evolutionary goal is certainly one possibility. But you can also decide that you genuinely care about some other purpose, and use the knowledge about yourself to figure out how to put yourself in situations which promote the kind of purpose that you prefer. Maybe I know that status benefits cause me to more reliably take altruistic action than rational calculation about altruism does, so I seek to put myself in communities where rationally calculating the most altruistic course of action and then taking that action is high status. And I also try to make this more high status in general, so that even people who genuinely only care about the status benefits end up taking altruistic actions.
Note that choosing to embrace most kinds of selfishness is no less arbitrary, and goes no less against evolution’s goals, than choosing to embrace altruism. What evolution really cares about is inclusive fitness: if you’re going by the “oh, this is what evolution really intended you to do” route, then for the sake of consistency you should say “oh, evolution really intended us to have lots of surviving offspring, so I should ensure that I make regular egg/sperm donations and also get as many women as possible pregnant / spend as much time as possible being pregnant myself”.
Most people don’t actually want that, no matter what evolution designed us to do. So instead they choose to act selfishly, or altruistically, or some mixture of the two, or in some way that doesn’t really map onto the selfish/altruistic axis at all. And if that seems to go beyond the original evolutionary purposes of the cognitive modules which are pushing us in whatever direction we end up caring about, then so what?
And of course, talking about the “purpose” or “intention” of evolution in the first place is anthropomorphism. Evolution doesn’t actually care about anything, and claims like “we don’t really care about altruism” are only shorthand for “we come equipped with cognitive modules which, when put in certain situations, push us to act in particular ways which—according to one kind of analysis—do not reliably correlate with the achievement of altruistic acts while more reliably correlating with achieving status; when put in different situations, the analysis may come out differently”. That’s a purely empirical fact about yourself, not one which says anything about what you should care about.
Thank you for the explanation. I was trying to play the devil’s advocate a bit and I didn’t think my comment would be well-received. I’m glad to have gotten a thoughtful reply.
Thinking about it some more, I didn’t mean to anthropomorphize evolution, just to point out Homo hypocritus. For any particular value a person holds, we have:
What they tell people about it.
How they act on it.
How they feel about it.
I feel bad about a lot of suffering (mostly the suffering closest to me, of course). However, it’s not clear to me that what I feel is any more “me” than what I do or what I say.
Most everyone (except psychopaths) feels bad about suffering, and tells their friends the same, but they don’t do much about it unless it’s close to their personal experience. Evolution programmed us to be hypocritical. However, in this context it’s not clear to me why we’d choose to act on our feelings rather than bring our feelings in line with our actions (i.e., stop caring about distant non-cute animals), or why we’d choose to stop being hypocritical at all. We have lots of examples throughout history of large groups of people ceasing to care about the suffering of certain groups, often due to social pressures. I think the tide can swing both ways here.
So I have trouble seeing how these movements would work without social pressures and appeals to self-interest. I guess there’s already a lot of pro-altruism social pressure on LW?
Edit: as a personal example, I feel more altruistic than I act, and act more altruistic than I let on to others. I do this because I’ve only gotten disutility from being seen as a nice guy, and have refrained from a lot of overt altruism because of this. I think I’d need a change in micro-culture to change my behavior here; appeals to logic aren’t going to sway me.
Any examples?
“Only” was a gross exaggeration. I’m not sure why I typed it.
I think my examples are pretty typical, though. Charitable people get lobbied by people who want charity. This occurs with both personal and extended charity. In my case, it means I get bugged into spending more time on other people’s technical problems (e.g. open-source software projects) than I’d like.
I haven’t contributed to many charities, but the ones I have seem to have put me on mailing and call lists. I also once contributed to a political candidate for his anti-war stance, and have been rewarded with political spam ever since. I’m not into politics at all, so it’s rather unwelcome.
Most everyone (except psychopaths) feels bad about suffering, and tells their friends the same, but they don’t do much about it unless it’s close to their personal experience.

I’m not sure how much truth there is in this generalisation. Countless environmental activists, conservationists and humanitarian workers across the globe willingly give their time and energy to causes that have little or nothing to do with satisfying their own local needs or wants. Whilst they may not be in the majority, they are nevertheless a significant minority. I doubt many of them would be happy to be told they are only ‘signalling altruism’ to appear better in the eyes of their peers.
On the other hand, I suppose you could argue the case that such people have X-altruistic personalities and that perhaps that isn’t a desirable quality in terms of creating a hypothetical perfect society.
Sometimes, pleas for altruism are exactly what they seem. Not everything is a covert attempt at signaling. Trying to say that altruism is not serving self-interested reasons is kind of missing the point.