I tend to agree, in that I also have a steep discount across time and distance (though I tend to think of it as “empathetic distance”, more about perceived self-similarity than measurable time or distance, and I tend to think of weightings in my utility function rather than using the term “moral responsibility”).
That said, it’s worth asking just how steep a discount is justifiable—WHY do you think you’re more responsible to a neighbor than to four of her great-grandchildren, and do you think this is the correct discount to apply?
And even if you do think it’s correct, remember to shut up and multiply. It’s quite possible for there to be more than 35x as much sentience in 10 generations as there is today.
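To make the multiplication concrete, here is a minimal sketch (my illustrative numbers, not a claim from the comment) of what growth rate the 35x figure implies, and what a historically plausible rate compounds to:

```python
# What per-generation growth factor would yield 35x as many people
# after 10 generations? Solve g**10 = 35 for g.
required_growth = 35 ** (1 / 10)   # ~1.43x per generation

# Conversely, an assumed (hypothetical) 1.3x growth per generation
# compounds over 10 generations to:
compounded = 1.3 ** 10             # ~13.8x today's population

print(f"growth needed for 35x: {required_growth:.2f}x/generation")
print(f"1.3x/generation after 10 generations: {compounded:.1f}x")
```

The point survives even under modest assumptions: compounding over ten generations easily produces an order of magnitude more sentience than exists today, so a steep discount has a lot of multiplication to overcome.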