As you can see from both of my comments above, it’s not the mathematical aspect that’s problematic. Your choosing the word “disutility” means you’ve already accepted that these units are convertible to a single currency.
In what manner would you prefer someone to decide such dilemmas? Arguing that the various sufferings might not be convertible at all is an additional problem, not a solution—it isn’t an algorithm that indicates how a person or an AI should decide.
I don’t expect that you think that an AI should explode in such a dilemma, nor that it should prefer to save neither potential torture victim nor potential dustspecked multitudes....