Are we comfortable saying that this is a conflict between ethical altruism and ethical egoism?
I acknowledge the arguments are sound from the altruist perspective. If I argue them, my arguments will not be altruistic. Let's table this discussion and revisit it elsewhere as a 'convince me altruism is better' discussion, without limiting it to post-secondary testing. There is a popular perspective that if you are rational, you will agree that altruism is the answer. I'm not convinced of that yet.
If altruism/egoism is too narrow, we can use wants-to-kill-Moloch versus Moloch-can’t-be-killed-so-make-your-sacrifice.
I'm comfortable with that. I don't think rationality == altruism, but I do think that if altruism is your preference, then it's irrational not to be altruistic, and I further think the typical human prefers to be altruistic even if they don't realize it yet. I think altruistic humans are happier than non-altruistic ones, and the "warm fuzzy" variants of altruism cause happiness. (Cheating is like the anti-warm-fuzzy: it is a cold slimy.)
Like I said:
Rationality is winning, but winning is having the world arranged according to your preferences and most people’s preferences include moral preferences.
Absent that last clause, you can get into a debate about when altruism is-and-is-not rational (and at that point we’re not talking about morality and we are talking about game theory, so we should stop using the word “altruism” and instead use “cooperation”), but since we’re all human beings here I implicitly took it as a terminal value. I agree that there can be rational minds that do not work that way.