Sadly, I still find many, many people arguing that “altruism” or “selflessness” just can’t exist, that everyone is purely selfish, and that helping others is only done either because you expect them to pay you back later on (IPD-like, as in an iterated prisoner’s dilemma) or because it makes you feel good to do so.
I have tried many arguments, from opportunity cost (yes, giving to charity may give a “warm fuzzy”, as Eliezer says, but spending the same money on buying a video game, going to a concert, or eating yummy food can easily give more happiness) to exceptional situations (an atheist (so you can’t invoke fear of hell) from the Resistance withstanding torture rather than betray his friends), but they always manage to dodge the issue and find pseudo-arguments like “they still do it only out of fear of shame”.
So I ended up trying to imagine hypothetical situations like this: “Aliens kidnap you and offer you a choice between pressing a Blue Button or a Red Button. If you press the Blue Button, you’ll forget everything about the aliens, wake up the next day in perfect health, and find a winning lottery ticket in your mailbox, but the aliens will destroy the Earth once you die of natural causes. If you press the Red Button, the aliens will give Earth a cure for cancer, AIDS, and more, but they’ll torture you for months and then kill you. Do you claim no human would ever press the Red Button?” But I guess that even then they’ll say “but the shame felt while pressing the Blue Button will be too high”.
And anyway, “escalating” the conflict at this point doesn’t feel like a “clean” answer to me. So I’m still looking for a more clever way of making people understand that humans are more complicated and you can’t explain all of “altruism” by just guilt feelings and warm fuzzies.
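For reference, the IPD-style reciprocity that the reductionist argument invokes can be sketched in a few lines. This is a minimal toy simulation of two tit-for-tat players, using the conventional prisoner’s dilemma payoffs (T=5, R=3, P=1, S=0); it just illustrates how mutual “helping” can emerge from pure payoff-maximization, which is exactly the explanation being disputed above:

```python
# Minimal iterated prisoner's dilemma: two tit-for-tat players.
# Payoffs are the conventional values: T=5, R=3, P=1, S=0.

PAYOFFS = {  # (my move, their move) -> my payoff
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(opponent_history):
    """Cooperate on the first round, then copy the opponent's last move."""
    return opponent_history[-1] if opponent_history else "C"

def play(rounds=10):
    a_hist, b_hist = [], []
    a_score = b_score = 0
    for _ in range(rounds):
        a = tit_for_tat(b_hist)  # A reacts to B's past moves
        b = tit_for_tat(a_hist)  # B reacts to A's past moves
        a_score += PAYOFFS[(a, b)]
        b_score += PAYOFFS[(b, a)]
        a_hist.append(a)
        b_hist.append(b)
    return a_score, b_score

print(play())  # two reciprocators settle into mutual cooperation: (30, 30)
```

Note that this mechanism only explains helping when future payback is possible, which is why the torture and alien-button examples above are chosen to rule it out.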
...humans are more complicated and you can’t explain all of “altruism” by just guilt feelings and warm fuzzies.
Chimpanzees also engage in altruism, even interspecies altruism. Humans tend to go a step further by using moral language and formalizing their morality. But how much of it is done for the purpose of signaling and rationalization compared to the altruism we share with chimpanzees and other animals?
But how much of it is done for the purpose of signaling and rationalization
What do you mean by “purpose” in this context? A “purpose” is a property of an optimization process, so the answer will depend on which optimization process you’re talking about. Are you asking about evolution or our conscious thought process?
...but spending the same money on buying a video game, going to a concert, or eating yummy food can easily give more happiness
Not to mention donating to a less efficient but more warm-fuzzy-generating charity.
“but the shame felt while pressing the Blue Button will be too high”
Don’t forget to add that they’ll immediately wipe your memory the moment you press the button.
I’d suggest bringing up time discounting. Favoring your earlier self isn’t technically altruistic, but it’s still something other than caring about your total happiness. The same goes for addictions.
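The time-discounting point can be made concrete with a toy calculation (a minimal sketch with hypothetical numbers: exponential discounting, a discount factor of 0.5 per step, and made-up rewards):

```python
# Toy illustration of time discounting: an agent that favors its
# earlier self maximizes discounted utility, not total happiness.

def discounted_utility(rewards, discount=0.5):
    """Sum rewards weighted by an exponential discount per time step."""
    return sum(r * discount**t for t, r in enumerate(rewards))

small_now  = [10, 0, 0]   # a small reward immediately
big_later  = [0, 0, 30]   # a larger reward two steps later

# Total happiness favors waiting (30 > 10), but discounting
# flips the preference toward the earlier self.
print(discounted_utility(small_now))  # 10.0
print(discounted_utility(big_later))  # 7.5
```

So a discounting agent picks the smaller immediate reward even though it makes its life as a whole less happy, which is another counterexample to “everyone just maximizes their own happiness”.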
Wasn’t there a different article about this?