One possible answer: Humans are selfish hypocrites. We pretend to hold general moral rules because it is in our best interest to do so. We've even evolved to convince ourselves that we actually care about morality rather than self-interest. That likely occurred because it is easier to make a claim one believes than to lie outright, so humans who are convinced they really care about morality will do a better job of acting as if they do.
(Someone listed this as one of the absolute deniables in the thread a while back about weird things an AI might tell people.)
Sounds like Robin Hanson’s Homo Hypocritus theory.