If you grant, as I do, that there exists a single net amount of utility shared among all conscious beings in the universe...
You might want to double check that one. Mathematically speaking, adding together the utility of two different agents is best thought of as a type error, for multiple reasons.
The first problem is invariance: utility is only defined up to positive affine transformation, i.e. multiplying it by a positive constant and adding another constant leaves the underlying preferences unchanged. But if we add the utilities together, then the preferences expressed by the summed utility function do depend on which constants we multiply the two original utility functions by. It’s analogous to adding together “12 meters” and “3 kilograms”.
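To make that concrete, here’s a minimal sketch (the agents, outcomes, and numbers are invented purely for illustration): rescaling one agent’s utility leaves that agent’s own preferences untouched, but changes which outcome the naive sum favors.

```python
# Two agents assign utilities to two outcomes, A and B.
alice = {"A": 1.0, "B": 0.0}   # Alice prefers A
bob   = {"A": 0.0, "B": 1.0}   # Bob prefers B

def rescale(u, a, b):
    """Apply a positive affine transformation u -> a*u + b (a > 0).
    This leaves the agent's own preference ordering unchanged."""
    return {outcome: a * value + b for outcome, value in u.items()}

def total(u1, u2):
    """Naively add the two agents' utilities outcome-by-outcome."""
    return {o: u1[o] + u2[o] for o in u1}

# With the original representations, the 'sum' is indifferent between A and B:
print(total(alice, bob))                      # {'A': 1.0, 'B': 1.0}

# Rescale Alice's utility by 3 (her own preferences are unchanged),
# and the 'sum' now strictly prefers A:
print(total(rescale(alice, 3.0, 0.0), bob))   # {'A': 3.0, 'B': 1.0}

# Rescale Bob's instead, and the 'sum' prefers B. The aggregate ordering
# depends on an arbitrary choice of representation, which is the sense in
# which the addition is not well-defined.
print(total(alice, rescale(bob, 3.0, 0.0)))   # {'A': 1.0, 'B': 3.0}
```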
The second problem is that a utility function is only defined within the context of a world model—i.e. we typically say that an agent prefers to maximize E[u(X)], with the expectation taken over some model. Problem is, which variables X even “exist” to be used as inputs depends on the model. Even two different agents in the same environment can have models with entirely different variables in them (i.e. different ontologies). Their utility functions are defined only in the context of their individual models; there’s not necessarily any way to say that their utility is over world-states, and the different models do not necessarily have any correspondence.
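Here’s a toy illustration of what I mean by “type error” (both world models and their utility functions are made up for the sake of example):

```python
from dataclasses import dataclass

@dataclass
class AliceModel:          # the variables Alice's world model contains
    apples_eaten: int
    minutes_of_music: float

@dataclass
class BobModel:            # Bob's model carves up the world differently
    paperclips_made: int

def alice_utility(state: AliceModel) -> float:
    return 2.0 * state.apples_eaten + 0.5 * state.minutes_of_music

def bob_utility(state: BobModel) -> float:
    return float(state.paperclips_made)

# There is no single "world state" object that both functions accept, so an
# expression like `alice_utility(s) + bob_utility(s)` is literally a type
# error: no value s has the fields both models require, and nothing in the
# math supplies a canonical translation between the two ontologies.
```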
If adding utilities across agents has any meaning at all, there’s nothing in the math to suggest what that meaning would be, and indeed the math suggests pretty strongly that it is not meaningful.
Are you using ‘utility’ in the economic sense, in which a utility function is purely ordinal? Perhaps I should have used a different word, but I’m referring to ‘net positive conscious mental states,’ which intuitively doesn’t seem to suffer from the same issues.
Yes, I was using it in the economic sense. If we say something like “net positive conscious mental states”, it’s still unclear what it would mean to add up such things. What would “positive conscious mental state” mean, in a sense which can be added across humans, without running into the same problems which come up for utility?
I don’t think it is operationalizable, but I fail to see why ‘net positive mental states’ isn’t a meaningful, real value. Maybe the units would be apple*minutes or something, where one unit is equivalent to the pleasure you get from eating an apple for one minute. It seems that this could in principle be calculated with full information about everyone’s conscious experience.
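To gesture at the kind of in-principle calculation I have in mind (all the people, experiences, and numbers below are hypothetical):

```python
# Suppose full information about everyone's conscious experience let us rate
# each experience in "apple-minutes": 1.0 = the pleasure of eating an apple
# for one minute, negative values = suffering of equivalent intensity.
experiences_in_apple_minutes = {
    "person_1": [3.0, -1.5, 0.5],   # e.g. a good meal, a stubbed toe, a joke
    "person_2": [2.0, 4.0],
    "person_3": [-0.5],
}

# The "single net amount" would then just be the grand total.
net = sum(sum(vals) for vals in experiences_in_apple_minutes.values())
print(f"net conscious welfare: {net} apple-minutes")   # 7.5 apple-minutes
```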