Irrationality game
Moral intuitions are very simple. A general idea of what it means for somebody to be human is enough to severely restrict the variety of moral intuitions you would expect it to be possible for them to have. Thus, conditioned on Adam’s humanity, you would need very little additional information to get a good idea of Adam’s morals, while Bob the alien would need to explain his basic preferences at length before you could model his moral judgements accurately. It follows that the tricky part of explaining moral intuitions to a machine is explaining humans, and it’s not possible to cheat by formalizing morality separately.
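One way to read this claim is in rough description-length terms; the symbols below (H, M, and K(·|·)) are my shorthand for the argument, not the poster's:

```latex
% Informal description-length sketch of the claim above:
% conditioned on a description H of "what it means to be human",
% Adam's moral intuitions M_Adam need little further specification,
% while Bob the alien's M_Bob are not compressed much by H.
\[
  K(M_{\text{Adam}} \mid H) \;\ll\; K(M_{\text{Adam}}),
  \qquad
  K(M_{\text{Bob}} \mid H) \;\approx\; K(M_{\text{Bob}}),
\]
% where K(x | y) denotes conditional Kolmogorov complexity
% (the length of the shortest description of x given y).
```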
Please attach a probability.
Fairly certain (85%–98%).
That is a very wide range. Downvoted you anyway.