If you don’t consider this particular type of disutility (the dust speck) convertible into the other (torture), the standard follow-up argument is to ask you to identify the smallest kind of disutility that might nonetheless be somehow convertible.
The typical list of examples includes “a year’s unjust imprisonment”, “a broken leg”, “a splitting migraine”, “a bout of diarrhea”, “an annoying hiccup”, “a papercut”, and “a stubbed toe”.
Would any of these, if it took the place of the “dust speck”, change your position so that you would now prefer to avert 3^^^3 instances of the lesser disutility rather than avert the torture of the single person?
E.g. is it better to save a single person from being tortured for 50 years, or better to save 3^^^3 people from suffering a year’s unjust imprisonment?
As you can see from both of my comments above, it’s not the mathematical aspect that’s problematic. By choosing the word “disutility”, you’ve already accepted these units as convertible into a single currency.
In what manner would you prefer someone to decide such dilemmas? Arguing that the various sufferings might not be convertible at all is an additional problem rather than a solution; it offers no algorithm that indicates how a person or an AI should decide.
I don’t expect that you think an AI should explode when faced with such a dilemma, nor that it should prefer to save neither the potential torture victim nor the potential dust-specked multitudes...
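
To make concrete what “a single currency” commits you to, here is a minimal sketch (Python, with entirely made-up disutility numbers and a large stand-in for 3^^^3, purely for illustration) of the kind of aggregating decision procedure that the word “disutility” already presupposes:

```python
# Minimal sketch of a single-currency (aggregating) decision procedure.
# The disutility figures are arbitrary placeholders, not claims about
# how bad these harms actually are.

HARM_DISUTILITY = {
    "dust speck": 1e-9,
    "stubbed toe": 1e-3,
    "broken leg": 10.0,
    "year of unjust imprisonment": 1e4,
    "50 years of torture": 1e9,
}

def total_disutility(harm: str, num_people: float) -> float:
    """Aggregate disutility = per-person disutility times number of people affected."""
    return HARM_DISUTILITY[harm] * num_people

def prefer_to_avert(option_a, option_b):
    """Choose to avert whichever option carries the greater total disutility."""
    return max(option_a, option_b, key=lambda option: total_disutility(*option))

# 3^^^3 is far too large to represent exactly; any astronomically large
# stand-in makes the same point.
N = 1e100

print(prefer_to_avert(("50 years of torture", 1), ("dust speck", N)))
# -> ('dust speck', 1e+100): once the harms share a unit, the multitude wins
#    for any positive per-person disutility, however small.
```

Rejecting convertibility amounts to denying that any table like HARM_DISUTILITY exists, which, as I said, is an additional problem rather than a solution: it leaves unspecified what the decision procedure should be instead.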