I’m not sure if I’m representative, but I don’t think this does anything to address my personal intuitive dislike of the repugnant conclusion.
I think I intuitively agree that the world with tons of less-happy people is indeed more valuable; this could be affirmed by e.g. supposing that both worlds already exist, but both are in danger of destruction, and I only have the resources to save one of them. I think I would save the huge number of people rather than the small number of people, even if the huge number of people are less happy. (I suspect a lot of people would do the same.)
My intuitive problem is with the idea that there’s a duty to create people who don’t already exist. I intuitively feel like nonexistent people shouldn’t be able to make demands of me, no matter how good an exchange rate they can get from my happiness to theirs. (And similarly, that nonexistent people shouldn’t be able to demand that I make some existing third party less happy, no matter how much it helps the nonexistent person.)
In fact, I feel like I have kind of a nexus of confused intuitions regarding:

- Ethics offsets
- The difference between ethically-good and ethically-obligatory
- The difference between it being good for Alice to give X to Bob, and it being good for Bob to take X from Alice without her consent
This seems like it might be some sort of collision between an axiology frame and a bargaining frame? Like there’s a difference between “which of these two states is better?” and “do you have the right to unilaterally move the world from the worse state to the better state?”