I would add:
Conflict theory vs. comparative advantage
Is it possible for the wrong kind of technological development to make things worse, or does anything that increases aggregate productivity always make everyone better off in the long run?
Cosmopolitanism vs. human protectionism
Is it acceptable, or even good, to let humans go extinct if they will be replaced by an entity that's more sophisticated or advanced in some way, or should we defend humanity simply because we're human?
The mammogram problem is different because you're only trying to determine whether a specific woman has cancer, not whether cancer exists at all as a phenomenon. If Bob was abducted by aliens, that implies alien abduction is real, but the converse isn't true. You either need to do two separate Bayesian updates (first, the probability that Bob was abducted given his experience; then, the probability that aliens are real given that new probability of Bob's abduction), or you need a joint distribution over all the possibilities (Bob not abducted and aliens not real; Bob not abducted and aliens real; Bob abducted and aliens real; the fourth combination, Bob abducted but aliens not real, is impossible by definition).
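For concreteness, here's a minimal sketch of the joint-distribution version in Python. Every number in it (the prior on aliens being real, the chance Bob specifically was taken, the likelihood of a mistaken report) is made up purely for illustration; the point is only that a single update over the joint states answers both questions at once, as marginals of the same posterior.

# States are (aliens_real, bob_abducted). The (False, True) state is
# logically impossible, so it's simply left out (prior probability zero).
# All numbers below are illustrative, not claims about the real world.

p_aliens = 1e-6                  # prior that alien abduction is real at all
p_abducted_given_aliens = 1e-3   # chance Bob specifically was taken, if it is real

prior = {
    (False, False): 1 - p_aliens,
    (True,  False): p_aliens * (1 - p_abducted_given_aliens),
    (True,  True):  p_aliens * p_abducted_given_aliens,
}

# Likelihood of Bob reporting a vivid abduction experience in each state
likelihood = {
    (False, False): 1e-4,   # false memory, dream, hoax, etc.
    (True,  False): 1e-4,   # aliens exist but didn't take Bob; same mundane explanations
    (True,  True):  0.9,    # he really was abducted
}

# One Bayesian update over the joint distribution
unnorm = {s: prior[s] * likelihood[s] for s in prior}
total = sum(unnorm.values())
posterior = {s: p / total for s, p in unnorm.items()}

# Both questions fall out as marginals of the same posterior
p_bob_abducted = sum(p for (aliens, abducted), p in posterior.items() if abducted)
p_aliens_real  = sum(p for (aliens, abducted), p in posterior.items() if aliens)

print(f"P(Bob was abducted | his report) = {p_bob_abducted:.3g}")
print(f"P(aliens are real | his report)  = {p_aliens_real:.3g}")

Running the sequential two-update version with the same numbers gives the same answer; the joint table just makes it harder to accidentally double-count the evidence or forget that "Bob abducted" already entails "aliens real."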