This post feels quite important from a global priorities standpoint. Nuclear war mitigation might once have been one of the top priorities for humanity (and to be clear, it's still plausibly quite important). But given that the longtermist community has limited resources, it matters a lot whether something falls within the top 5-10 priorities.
A lot of people ask, "Why is there so much focus on AI in the longtermist community? What about other x-risks like nuclear?" And I think the counterintuitive but important answer is that nuclear war probably isn't an x-risk.
Like Jeff and Buck, I still think it's worth someone following up and investigating the phenomenon here in more detail. It's disappointing that humanity hasn't studied this problem in as much depth as it could have.