If emotions are necessary but not sufficient for forming goals among humans, then the claim that rationality has no normative value to humans without goals leaves unaddressed rationality’s normative value to humans who have emotions but no goals.
If you see them as equivalent, this implies that you believe emotions are necessary and sufficient for forming goals among humans.
Even if this is true for humans, it would be strange to say that after goals are formed, the loss of emotion in a person would obviate all their already-formed non-emotional goals. So it’s not just that you’re discussing the human case and not the AI case; you’re discussing the typical human.