>It is even possible (in fact, due to resource constraints, it is likely) that they’re at odds with one another.
They’re almost certainly extremely at odds with each other. Saving humanity from destroying itself points in the other direction from reducing suffering, not by 180 degrees, but at a very sharp angle. This is not just because of resource constraints, but even more so because humanity is a species of torturers and it will try to spread life to places where it doesn’t naturally occur. And that life obviously will contain large amounts of suffering. People don’t like hearing that, especially in the x-risk reduction demographic, but it’s pretty clear the goals are at odds.
Since I’m a non-altruist, there’s not really any reason to care about most of that future suffering (assuming I’ll be dead by then), but there’s not really any reason to care about saving humanity from extinction, either.
There are some reasons why the angle is not a full 180 degrees: there might be aliens who would also cause suffering and with whom humanity might compete for resources, humanity might wipe itself out in ways that also cause suffering (such as AGI), or there might be practical correlations between political philosophies that produce both high suffering and high extinction probability, e.g. torturers are less likely to care about humanity’s survival. But none of these make the goals point in the same direction.