The probability of human survival is primarily driven by AI systems caring a small amount about humans (whether due to ECL, commonsense morality, complicated and messy values, acausal trade, or whatever—I find all of those plausible).
I haven’t thought deeply about this question, because a world where AI systems don’t care very much about humans seems pretty bad for humans in expectation. I do think it matters whether the probability we all literally die is 10% or 50% or 90%, but it doesn’t matter very much to my personal prioritization.