Wait, are you claiming that humans have moral intuitions because having them maximizes global utility? Surely moral intuitions were produced by evolution. Why would evolution select for agents whose behaviour maximizes global utility?
No, I’m claiming that moral intuitions reflect the precomputation of higher-order strategic considerations (of the sort “if I let this person get away with stealing a bike, then I will be globally worse off even though I seem locally better off”).
I agree that you should expect evolution to create agents that maximize inclusive genetic fitness, which is quite different from global utility. But even if one adopts the frame that "utilitarian calculus is the standard of correctness," one can still use those moral intuitions as valuable cognitive guides, since they direct attention towards considerations that might otherwise be missed.