I don’t know of a game-theoretic formalism that lets agents win something other than generic “dollars” or “points”, such that we can encode in the formalism that agents share some values but not others, and have tradeoffs among their different values.
I suspect this isn’t the main obstacle to reducing ethics to game theory. Once I’m willing to represent agents’ preferences with utility functions in the first place, I can operationalize “agents share some values” as some features of the world contributing positively to the utility functions of multiple agents, while an agent having “tradeoffs among their different values” is encoded in the same way as any other tradeoff they face between two things — as a ratio of marginal utilities arising from a marginal change in either of the two things.
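A minimal sketch of how that could be encoded (the agents, world features, and weights below are invented for illustration; this isn’t any standard formalism):

```python
from dataclasses import dataclass, replace

@dataclass
class WorldState:
    # Hypothetical features of the external world; the names are illustrative.
    commons_quality: float   # a feature both agents happen to care about
    alice_leisure: float     # a feature only Alice cares about
    bob_income: float        # a feature only Bob cares about

def u_alice(s: WorldState) -> float:
    # Alice's utility loads on the shared feature and on her private feature.
    return 2.0 * s.commons_quality + 1.0 * s.alice_leisure

def u_bob(s: WorldState) -> float:
    # Bob's utility loads on the same shared feature, plus his private feature.
    return 1.5 * s.commons_quality + 1.0 * s.bob_income

def marginal_utility(u, s: WorldState, feature: str, eps: float = 1e-6) -> float:
    # Finite-difference approximation of du/d(feature) at state s.
    bumped = replace(s, **{feature: getattr(s, feature) + eps})
    return (u(bumped) - u(s)) / eps

s0 = WorldState(commons_quality=1.0, alice_leisure=1.0, bob_income=1.0)

# "Shared value": the same world feature contributes positively to both utilities.
assert marginal_utility(u_alice, s0, "commons_quality") > 0
assert marginal_utility(u_bob, s0, "commons_quality") > 0

# "Tradeoff among one agent's values": the ratio of marginal utilities from a
# marginal change in either feature (here Alice would give up ~2 units of
# leisure for 1 more unit of commons_quality at the margin).
alice_mrs = (marginal_utility(u_alice, s0, "commons_quality")
             / marginal_utility(u_alice, s0, "alice_leisure"))
```

The point is just that utilities are defined over the world state rather than over a private payoff, so “shared” versus “private” values fall out of which features appear in which agents’ utility functions.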
Well yes, of course. It’s the “share some values but not others” that’s currently not formalized: in current game theory, agents are (to my knowledge) only paid in “money”, a single scalar dimension measuring utility as a function of the agent’s experiences of game outcomes (rather than as a function of states of the game, construed as an external world the agent cares about).
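To put the contrast I have in mind in notation (my own, not quoted from any textbook):

```latex
% The picture I'm objecting to: agent $i$'s utility is a scalar function of
% their own payoff / experienced outcome $m_i$:
\[
  u_i : \mathbb{R} \to \mathbb{R}, \qquad u_i(m_i).
\]
% What "sharing some values but not others" seems to want: utilities defined
% directly over the game's world state $s \in S$, so that a single component
% $s_k$ of the state can enter more than one agent's utility:
\[
  u_i : S \to \mathbb{R}, \qquad
  \frac{\partial u_1}{\partial s_k} > 0
  \quad\text{and}\quad
  \frac{\partial u_2}{\partial s_k} > 0
  \quad\text{for a shared feature } s_k.
\]
```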
So yeah.