I personally think that something more akin to minimum utilitarianism is more in line with my intuitions. That is, to a first-order approximation, define utility as (soft)min(U(a), U(b), U(c), U(d), ...), where a, b, c, d, … are the sentients in the universe. This utility function mostly captures my intuitions as long as we have reasonable control over everyone's outcomes, utilities are comparable across individuals, and the number of people involved isn't too crazy.
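To make the aggregation concrete, here is a minimal sketch of one common way to define a softmin, using the negative log-sum-exp form with a sharpness parameter `beta` (larger `beta` means closer to a hard minimum). The function name, the parameter, and the choice of this particular softening are my illustration, not something specified in the original comment.

```python
import numpy as np

def softmin(utilities, beta=10.0):
    """Smooth approximation of min(utilities).

    Uses the negative log-sum-exp form: -(1/beta) * log(sum(exp(-beta * u_i))).
    As beta -> infinity this approaches the hard minimum; smaller beta blends
    everyone's utility more evenly instead of focusing only on the worst-off.
    """
    u = np.asarray(utilities, dtype=float)
    # Shift by the minimum before exponentiating for numerical stability.
    m = u.min()
    return m - np.log(np.sum(np.exp(-beta * (u - m)))) / beta

# The aggregate is dominated by the worst-off sentient.
print(softmin([10.0, 4.0, 7.0]))            # close to 4 (slightly below it)
print(softmin([10.0, 4.0, 7.0], beta=100))  # even closer to the hard min
```

Under this sketch, improving the best-off person's utility barely moves the aggregate, while improving the worst-off person's moves it almost one-for-one, which is the intuition the paragraph above is pointing at.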