Yes in general this is a fairly esoteric question. I had a very specific reason for considering it however, which I’ll share with you.
What percentage of the 6.7 billion people on earth would it be moral to kill, say in a demonstration of the possibility of existential risk, in order to someday realize the eventual existence of 10^23 lives in the Virgo Supercluster?
If we consider the creation of new people with positive utility a moral imperative, it would seem that killing any number of today’s people, even over 6 billion, would be justified to even marginally increase the chances of creating a trillion-year galactic civilization. This doesn’t make sense to me, which is why I was looking into the issue.
If you want to retain total utilitarianism but don’t want this result, you can always do what economists do and apply discounting. The justification is that people seem to discount future utility somewhat relative to present utility, and that not discounting leads to perverse results. If you use a discount rate of, say, 2%* per year, then the utility of 10^23 people in 2500 years is equal to the utility of around 32 people today (10^23/1.02^2500 ≈ 31.59). Of course, if you think that the trillion-year galactic civilization is just around the corner, or that the people then will have much higher utility than current people do, that changes things somewhat.
*I picked that rate because I think it is about what was used in the Stern Review
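The arithmetic above can be sketched as a one-line function. This is just an illustration of exponential discounting under the stated assumptions (2% annual rate, each life counted as one unit of utility); the function name is mine, not anything standard.

```python
def discounted_equivalent(future_lives: float, rate: float, years: float) -> float:
    """Present-day equivalent of `future_lives` realized `years` years from now,
    under exponential discounting at annual rate `rate`."""
    return future_lives / (1 + rate) ** years

# 10^23 lives in 2500 years at 2% per year -> roughly 32 people today
print(discounted_equivalent(1e23, 0.02, 2500))
```

Note how sensitive the result is to the rate: at 1% the same calculation gives on the order of 10^12 present-day people, which is why the choice of discount rate dominates arguments like this.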
Ah. These problems go away if you accept that humanity is stuck on Earth and doomed. Or if you aren’t a utilitarian.