If you have perfect foresight and you know that action X is the only thing that will prevent the human race from going extinct, then maybe action X is justified. But none of those conditions apply.
That’s not true: we don’t make decisions based on perfect knowledge. If you believe the probability of doom is 1, or even just very high, then any action that prevents or delays it is worth pursuing; it’s a matter of expected value.
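To make the expected-value framing concrete, here is a minimal sketch, with symbols that are purely illustrative and not the speakers’ own estimates:

\[
\mathbb{E}[\text{act}] \;=\; \Delta p_{\text{doom avoided}} \times V_{\text{survival}} \;-\; C_{\text{action}}
\]

If \(V_{\text{survival}}\) is treated as effectively unbounded and \(\Delta p_{\text{doom avoided}}\) as non-trivial, the product swamps any finite \(C_{\text{action}}\), which is why the conclusion is so sensitive to the probability estimates one plugs in.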