Use a sufficiently intelligent AI to find objective morality, if it exists and if it makes sense. It would have a better understanding of it than we do. Provided, of course, that the sufficiently intelligent AI doesn't kill us all first.
Isn’t morality a human construct? Eliezer’s point is that morality is defined by us, not by an algorithm or a rule or something similar. If it were defined by something else, it wouldn’t be our morality.
He doesn’t have a proof that it is, because he doesn’t have an argument against the existence of objective morality, only an argument against its motivational force.
If it were defined by something else, it wouldn’t be our morality.
And “our morality” wouldn’t be morality if it departed from the moral facts.
How would you define objective morality? What would make it objective? If it did exist, how would you possibly be able to find it?
There are various theories of moral realism, which you can find in standard reference works.