What do you mean by moral facts? It sounds in context like “ways to determine which values to give precedence to in the event of a conflict.” But such orders of precedence wouldn’t be facts, they’d be preferences. And if they’re preferences, why are you concerned that they might not exist?
Moral facts are true moral propositions, where we’d consider a proposition to be a moral proposition if it makes a claim about something that normally falls under the field of ethics. It’s hard to be more precise without speculating about the nature of moral facts and committing to a particular system of ethics. For example, moral facts in an anti-realist, non-cognitivist sense might be something like reflexive relationships between moral agents (something like the golden rule), or even values an individual agent holds as true.
I agree we can talk about, say, an AGI having preferences for how it will resolve value conflicts, but I don’t think we can leave the issue there, because the AGI’s preferences then take on the functional role of moral facts in its reasoning about conflicting human values. This doesn’t mean the AGI is necessarily certain about these moral facts: it may change its preferences as it learns more, so that the moral facts effectively change.
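To make the functional-role point concrete, here is a minimal toy sketch of my own (not anyone’s proposed AGI design, and every name in it is hypothetical): an agent with a mutable precedence ordering over values. The ordering does the work moral facts would do when the agent resolves a conflict, and revising it is what “the moral facts effectively changing” would look like.

```python
# Toy illustration only: an agent whose preference ordering over
# values plays the functional role of moral facts. The class and
# method names are hypothetical, invented for this sketch.

class ValueConflictAgent:
    def __init__(self, precedence):
        # precedence: list of value names, highest priority first.
        # Functionally, this ordering is the agent's "moral facts".
        self.precedence = list(precedence)

    def resolve(self, conflicting_values):
        # Pick whichever conflicting value ranks highest (lowest
        # index) in the current precedence ordering.
        return min(conflicting_values, key=self.precedence.index)

    def update_preferences(self, new_precedence):
        # Learning can revise the ordering, so the "moral facts"
        # the agent acts on effectively change over time.
        self.precedence = list(new_precedence)

agent = ValueConflictAgent(["honesty", "autonomy", "welfare"])
print(agent.resolve(["welfare", "honesty"]))   # -> honesty
agent.update_preferences(["welfare", "honesty", "autonomy"])
print(agent.resolve(["welfare", "honesty"]))   # -> welfare
```

The point of the sketch is only that whatever occupies the `precedence` slot is doing fact-like work in the agent’s deliberation, regardless of whether we call it a preference or a moral fact.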
Perhaps you’ve revealed some of the trouble, though: “moral facts” may be too fuzzy a category for the sort of analysis I was attempting. My confusion results from a willingness to count as moral facts some things that are not, and I would be better served by engaging more carefully with the terminology.