Suppose the reader has a well-defined utility function in which death or torture is assigned a value of minus infinity. Then the writer can't persuade them to trade off death or torture against any finite amount of utility. So in what sense is the reader wrong about their own preferences?
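To make this concrete (a minimal sketch; the probability $p$ and payoff $x$ are illustrative, not from any particular scenario): if $U(\text{death}) = -\infty$, then any gamble carrying a positive probability of death has expected utility $-\infty$, no matter how large the finite payoff on the other branch:

$$
\mathbb{E}[U] = p \cdot U(\text{death}) + (1-p)\,x = p \cdot (-\infty) + (1-p)\,x = -\infty \quad \text{for any } p > 0,\ x < \infty.
$$

So the reader's refusal to trade is internally consistent: no finite $x$ changes the sign of the calculation.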
I think the original Bomb scenario should have come with, say, a $1,000,000 value for "not being blown up". That would have allowed easy and agreed-upon expected-utility calculations.
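As a sketch of what such a calculation might look like from the ex-ante (policy) perspective, assuming the $100 fee for taking the Right box from common statements of the scenario and writing $\varepsilon$ for the predictor's error rate:

$$
\mathbb{E}[U(\text{Left})] = -\varepsilon \cdot \$1{,}000{,}000, \qquad \mathbb{E}[U(\text{Right})] = -\$100,
$$

so taking Left maximizes expected dollars exactly when $\varepsilon < 100/1{,}000{,}000 = 10^{-4}$. With a stipulated value in hand, the disagreement reduces to a threshold on the predictor's error rate rather than a dispute over incomparable outcomes.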