What credence do you give to the proposition that every sperm is sacred?
Unless your credence is a lot less than one in a billion (which is dubious, given overconfidence bias), this dominates all other concerns.
The easiest argument against this is to observe that sperm are not the limiting factor in creating more lives: uterus time and parenting time are.
This is a good point, but it applies to moral uncertainty in general, not just to this particular case.
Isn’t that kinda the point? It suggests there’s probably something wrong with arguments of the form “such-and-such an improbable proposition about moral values would make a huge difference if correct, so we should all drop everything and attend to it”.
One thing that might be wrong: if moral values are not objective facts about the world but about particular people’s (or communities’) value systems, then it doesn’t make sense to ask “what’s the probability that every sperm is sacred?” or “what’s the probability that a foetus is about as important morally as an adult human?”; our values are what they are and it’s perfectly reasonable to have very little uncertainty about them. It remains reasonable to ask “what’s the probability that spermatozoa or foetuses have the properties that I do, in fact, regard as conferring moral significance?”—but that probability may reasonably be extremely low, e.g. on the grounds that spermatozoa don’t have brains.
How would it dominate? What would you do if it were true?
It would dominate (if you buy into the EA maximise-QALYs assumption) because the QALYs lost would massively outweigh those lost to everything else.
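To see why even a tiny credence dominates, here is a back-of-envelope expected-value sketch. Every number in it is an illustrative assumption I am supplying (rough sperm production, global male population, QALYs per potential life, daily death toll), not a sourced estimate; the point is only that the product of huge numbers swamps a one-in-a-billion credence.

```python
# Back-of-envelope expected-value comparison.
# All figures below are rough illustrative assumptions, not sourced data.

credence = 1e-9                 # probability that every sperm is morally significant
sperm_per_man_per_day = 1e8     # rough order-of-magnitude daily production
adult_men = 2e9                 # rough global count
qalys_per_potential_life = 50   # assumed QALYs per life that could have existed

# Expected QALYs "lost" per day to wasted sperm, under that credence
expected_sperm_qalys = (credence * sperm_per_man_per_day
                        * adult_men * qalys_per_potential_life)

# Rough QALYs lost per day to all ordinary causes of death, for comparison
deaths_per_day = 150_000
qalys_lost_per_death = 30
ordinary_qalys = deaths_per_day * qalys_lost_per_death

print(expected_sperm_qalys)   # ~1e10
print(ordinary_qalys)         # ~4.5e6
```

Under these (made-up) inputs the sperm term exceeds everything else by three to four orders of magnitude, which is exactly the "dominance" being discussed; the uncertainty in the assumed figures is far smaller than that gap.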
If it were true, then I suppose one could freeze sperm so that a future space-faring civilisation with room for a larger population could use them. But I’m not suggesting anyone actually bites this bullet.