How much less willing? Suppose A would give up only a million times more utility to save B and 10^100 other people than to save B alone. Would A, once informed of the existence of those 10^100 people, really refuse to save B alone at the price of a cent? It seems to me that would have to be the case if scope insensitivity were rational. (This isn’t my true objection, which I’m not sure how to verbalize at the moment.)
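The arithmetic behind the objection can be made explicit. A minimal sketch, assuming A's valuation is additive across lives and an illustrative willingness to pay of $1000 to save B alone (the dollar figure is my assumption, not from the comment):

```python
# Sketch of the implied per-person value under scope insensitivity.
# Assumptions (not from the source): additive valuation, $1000 WTP for B alone.

N = 10**100           # other people besides B
ratio = 10**6         # A pays only a million times more for all of them
wtp_b_alone = 1000.0  # assumed dollars A would pay to save B alone

# If valuation is additive, the group payment fixes each person's implied value:
wtp_group = ratio * wtp_b_alone      # total A would pay for B plus 10^100 others
per_person = wtp_group / (N + 1)     # implied value of any single life, B's included

print(per_person)         # on the order of 1e-91 dollars
print(per_person < 0.01)  # far below one cent
```

On these assumptions, consistency forces A to value saving B alone at roughly 10^-91 dollars, which is why the informed A would have to decline to save B even at the price of a cent.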