“The ultimate incentive to cooperate” seems like an apt decision-theoretic description of what evolution wrought.
It’s actually considerably more than just incentive to cooperate. Valuing the welfare/happiness of another above your own leads to many things other than game-theoretic cooperation.
I’d like to hear more of what you have to say about that.
Love (and consequences) is a very wide topic :-) Do you have anything particular in mind?