I’m comfortable with that. I don’t think rationality == altruism, but I do think that if altruism is your preference, then it’s irrational not to be altruistic, and I further think the typical human prefers to be altruistic even if they don’t realize it yet. I think altruistic humans are happier than non-altruistic ones, and the “warm fuzzy” variants of altruism cause happiness. (Cheating is like the anti-warm-fuzzy. It is a cold slimy.)
Like I said:
Rationality is winning, but winning is having the world arranged according to your preferences and most people’s preferences include moral preferences.
Absent that last clause, you can get into a debate about when altruism is and is not rational (and at that point we’re not talking about morality, we’re talking about game theory, so we should stop using the word “altruism” and instead use “cooperation”), but since we’re all human beings here I implicitly took it as a terminal value. I agree that there can be rational minds that do not work that way.