I think this depends on whether one takes an egoistic or even person-affecting perspective (“how will current humans feel about this when it happens?”) or a welfare-maximising consequentialist perspective (“how does this look from the view from nowhere?”). If one assumes total welfare to be linear or near-linear in the number of galaxies controlled, the 0.00002% outcome is far, far worse than the 20% outcome, even though I personally would still be happy with the former.
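A rough way to see the size of the gap, assuming the percentages refer to the share of reachable galaxies humanity ends up controlling and that welfare really does scale linearly with that share:

$$\frac{U(0.00002\%)}{U(20\%)} = \frac{0.0000002}{0.2} = 10^{-6}$$

So under the linear assumption the smaller outcome realises only about a millionth of the attainable value, which is why it looks drastically worse from the impartial perspective even if it still feels like a perfectly good outcome from the inside.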