Longtermism question: has anyone ever proposed a discount rate on the moral value of future lives? By analogy to discount rates used in finance and investing.
This could account for the uncertainty in predicting the existence of future people. Or serve as a compromise between views like neartermism and longtermism, or pro-natalism and anti-natalism.
Yes, this is an argument people have made, and longtermists tend to reject it. First off, applying a discount rate to the moral value of lives in order to account for the uncertainty of the future is not a good idea. Those are two different things and shouldn’t be conflated imo. If you want to discount for uncertainty about the future, just do that directly. So for the rest of the post I’ll assume the discount rate really is meant to apply to moral value itself, not stand in for uncertainty.
So, that leaves us with the moral argument.
A fairly good argument, and the one I subscribe to, is this:
Let’s say we apply a conservative discount rate, say, 1% per year, to the moral value of future lives.
Given that, one life now is worth approximately 500 million lives two millennia from now (0.99^2000 ≈ 2e-9).
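To make the arithmetic concrete, here’s a minimal sketch; the 1% rate and the 2000-year horizon are just the illustrative numbers from above:

```python
# Purely illustrative arithmetic for the 1%/year example above.
rate = 0.01      # assumed annual discount rate on moral value
years = 2000     # horizon: two millennia

discount_factor = (1 - rate) ** years    # ~1.9e-9
lives_equivalent = 1 / discount_factor   # ~5.4e8, i.e. roughly 500 million

print(f"Discount factor after {years} years: {discount_factor:.2e}")
print(f"One life today 'equals' about {lives_equivalent:,.0f} lives then")
```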
But would that have been true in the past? Would it have been morally correct to save one life two thousand years ago at the cost of 500 million lives today?
If the answer is “no” to that, it should also be considered “no” in the present.
This is, again, different from discounting future lives based on uncertainty. It’s entirely reasonable to say “If there’s only a 50% chance this person ever exists, I should treat them as 50% as valuable.” I don’t think that position would be controversial among longtermists.
The kinds of discount rates you see in finance imply that moral value drops to essentially zero within about 1,000 years, which we can be pretty confident isn’t true (the total amount of life tends to grow over time, not shrink). Discount rates are a crude heuristic for estimating value over time, and they won’t transfer cleanly to many situations.
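For contrast, here’s a quick sketch of what a finance-style rate does over long horizons versus what uncertainty-weighting does; the 5% rate and the 50% existence probability are assumed purely for illustration, not taken from any particular model:

```python
# Sketch contrasting the two kinds of "discounting" discussed above,
# using an assumed 5% rate as a stand-in for a finance-style discount rate.
finance_rate = 0.05

for years in (100, 500, 1000):
    factor = (1 - finance_rate) ** years
    print(f"After {years:>4} years, a 5%/year rate leaves {factor:.1e} of the value")
# After 1000 years the factor is ~5e-23: effectively zero, regardless of
# how many people actually exist then.

# Uncertainty-based weighting is different: it scales value by the
# probability that the person exists at all, not by how far away they are.
prob_exists = 0.5
value_if_exists = 1.0
expected_value = prob_exists * value_if_exists   # 0.5
print(f"Expected value of a 50%-likely future life: {expected_value}")
```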