The trouble is that there is nothing in epistemic rationality that corresponds to “motivations” or “goals” or anything like that. Epistemic rationality can tell you that pushing a button will lead to puppies not being tortured, and that not pushing it will lead to puppies being tortured. But unless you also have a system that incorporates a desire for puppies not to be tortured, plus a system for acting on that desire, that prediction is all epistemic rationality can give you.
MinibearRex
I don’t think EY actually suggests that people are doing those calculations. He’s saying that we’re just executing an adaptation that functioned well in groups of a hundred or so, but doesn’t work nearly as well anymore.