The trouble is that there is nothing in epistemic rationality that corresponds to “motivations” or “goals” or anything like that. Epistemic rationality can tell you that pushing a button will lead to puppies not being tortured, and that not pushing it will lead to puppies being tortured, but unless you have an additional system that incorporates a desire for puppies not to be tortured, as well as a system for acting on that desire, knowing those facts is all epistemic rationality gives you.
That’s entirely compatible with my point.