I can’t speak for shminux, of course, but caring about humanity surviving and thriving while not caring about the suffering or lives of strangers doesn’t seem at all arbitrary or puzzling to me.
I mean, consider the impact on me if 1000 people I’ve never met or heard of die tomorrow, vs. the impact on me if humanity doesn’t survive. The latter seems incontestably and vastly greater to me… does it not seem that way to you?
It doesn’t seem at all arbitrary that I should care about something that affects me far more than something that affects me less. Does it seem that way to you?
Yes, rereading it, I think I misinterpreted response 2 as saying it doesn’t matter whether a population of 1,000 people has a long future or a population of one googolplex [has an equally long future]. That is, that population scope doesn’t matter, just durability and survival. I thought this defeated the usual Big Future argument.
But even so, his response 5 turns it around: practically all people in the Big Future will be strangers, and if it is only “nicer” if they don’t suffer (translation: their wellbeing doesn’t really matter), then in what way would the Big Future matter?
I care a lot about humanity’s future, but primarily because of its impact on the total amount of positive and negative conscious experiences that it will cause.