My current hypothesis is that much of the evolutionary purpose of morality is signalling that you are predictably non-defecting enough to be worth dealing with.
Morality is also involved in punishment, signalling virtue, and manipulating the behaviour of others—so they stop doing the bad deeds that you don’t like.
Certainly. I think my central thesis is that morality is a set of cached answers to a really complicated game-theory problem under particular initial conditions (e.g. you are in a small tribe; you are poor in a big city; you are a comfortable Western suburbanite), with some of the answers cached in your mind and some cached in your genes. So it is unsurprising that using intelligence to extrapolate from those cached answers, without keeping a close eye on the game-theoretic considerations of the actual problem you are trying to solve, will lead to trouble.
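To make the game-theoretic point concrete, here is a minimal sketch, not from the conversation itself, of the kind of repeated-interaction problem being gestured at: an iterated prisoner's dilemma round-robin. The payoff matrix and strategy names (tit-for-tat, always-defect, random) are standard textbook assumptions chosen for illustration; the point is simply that a predictably non-defecting but retaliatory player tends to accumulate more than an unconditional defector once interactions repeat.

```python
import random

# Standard prisoner's dilemma payoffs: (my_score, their_score) indexed by
# (my_move, their_move), where True = cooperate, False = defect.
PAYOFFS = {
    (True, True): (3, 3),    # mutual cooperation
    (True, False): (0, 5),   # I cooperate, they defect
    (False, True): (5, 0),   # I defect, they cooperate
    (False, False): (1, 1),  # mutual defection
}

def tit_for_tat(history):
    """Cooperate first, then mirror the opponent's last move:
    predictable, hard to exploit, quick to resume cooperating."""
    return True if not history else history[-1][1]

def always_defect(history):
    """Unconditional defector."""
    return False

def random_player(history):
    """Cooperates or defects with equal probability."""
    return random.random() < 0.5

def play(strategy_a, strategy_b, rounds=200):
    """Play a repeated game and return total scores for both sides."""
    history_a, history_b = [], []  # each entry: (my_move, their_move)
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_a)
        move_b = strategy_b(history_b)
        pay_a, pay_b = PAYOFFS[(move_a, move_b)]
        score_a += pay_a
        score_b += pay_b
        history_a.append((move_a, move_b))
        history_b.append((move_b, move_a))
    return score_a, score_b

if __name__ == "__main__":
    strategies = {
        "tit_for_tat": tit_for_tat,
        "always_defect": always_defect,
        "random": random_player,
    }
    # Round-robin: each strategy faces every strategy (including itself)
    # once as the first player, and we accumulate its own scores.
    totals = {name: 0 for name in strategies}
    for name_a, strat_a in strategies.items():
        for _, strat_b in strategies.items():
            score_a, _ = play(strat_a, strat_b)
            totals[name_a] += score_a
    for name, total in sorted(totals.items(), key=lambda kv: -kv[1]):
        print(f"{name}: {total}")
```

Under these assumed payoffs the defector wins any single encounter but falls behind in the aggregate, which is the sense in which being predictably non-defecting is the profitable signal.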