Uh, there are minds. I think you and I both agree on this. I'm not really sure what the "what if no one existed" thought experiment is supposed to gesture at. I am very happy that I exist and that I experience things. I agree that if I didn't exist then I wouldn't care about things.
I think your method double-counts the utility. In the absurd case, if I care about you and you care about me, and I care about you caring about me caring about you… then two people who like each other enough have infinite value, unless the repeated sum converges. How likely is it that the converging sum comes out exactly right such that a selfish person should love all humans equally? Also, even if it were balanced, if two well-connected socialites in Latin America broke up, it would significantly change the moral calculus for millions of people!
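To make the convergence worry concrete, here's a minimal sketch (the per-layer weight r and base welfare u are my own illustrative assumptions, not something from your framing): if each extra layer of "caring about caring" gets weighted by a constant factor r, then the total value I assign when we both care about each other is a geometric series,

$$u + ru + r^2u + r^3u + \dots = \frac{u}{1-r} \quad \text{for } r < 1,$$

which stays finite only when r is strictly below 1, blows up as r approaches 1, and diverges outright for r ≥ 1. So the scheme only gives finite answers under a specific assumption about the weights, which is the point I'm pressing on.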
Being real for a moment, I think my friends (degree 1) are happier if I am friends with their friends (degree 2), want us to be at least on good terms, and would be sad if I fought with them. But my friends don't care that much how I feel about the friends of their friends (degree 3).
Apologies if I gave the impression that "a selfish person should love all humans equally"; while I'm sympathetic to arguments from e.g. Parfit's book Reasons and Persons[1], I don't go anywhere near that far. I was making a weaker and (I think) uncontroversial claim, something closer to Adam Smith's invisible hand: that aggregating over every individual's selfish focus on their own close ties results, overall, in moral concern becoming relatively more spread out, because the close circle of your close circle isn't exactly identical to your own.
For example, that distance in time and distance in space are analogous. If you imagine people in the distant past being offered a better life in their own time in exchange for there being no people in the far future, you'd wish they cared about more than just their own present. A similar logic argues against applying a very high discount rate to your moral concern for beings that are very distant from you in e.g. space or closeness of ties.
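To illustrate what a "very high discount rate" does here, consider the standard exponential form (which I'm assuming purely for concreteness, not attributing to anyone in this thread):

$$w(d) = e^{-\delta d},$$

where d is a being's distance from you (in time, space, or degrees of social separation) and δ is the discount rate. With a very high δ, the weight w(d) collapses to roughly zero for anyone even moderately distant, which is exactly the attitude the past-versus-future thought experiment is meant to push back on.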