I firmly agree that ignoring other catastrophes is a mistake, and that including more catastrophes in our estimates is necessary.
I don’t believe that independence is a valid assumption. Intuitively I expect catastrophic risks to share causes and to influence one another, which leads me to suspect they will cluster. For example, consider the following chain:
Climate change results in a drought in Syria → rebellion in Syria → civil war in Syria → Russia, Iran, and the United States get involved in the Syrian conflict → the risk of nuclear catastrophe increases
I break this down like so:
climate change → military conflict
military conflict → nuclear risk
The early development of computing was driven by military conflict; ENIAC was built for artillery firing tables and nuclear weapon simulations. I expect military conflict to similarly increase the risk of AGI by driving incentives for its development, so I also have:
military conflict → AGI risk
So, just from the cases of climate change, nuclear risk, and AGI risk, I have a causal relationship that looks like this:
climate change → nuclear risk & AGI risk
This isn’t enough for me to put sensible numbers on the problem, but it is enough for me to be suspicious of treating these risks as independent variables. So currently I cringe at questions of both the first type and the second type, but I haven’t developed any meaningful improvements.
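To make the structural worry concrete, here is a minimal Monte Carlo sketch of the diagram above. The probabilities are placeholders I made up purely for illustration, not estimates of anything; the point is just that a shared upstream driver (military conflict) makes the joint probability of the two downstream risks higher than the product of their marginals, which is what independence would predict:

```python
import random

# Toy Monte Carlo sketch of the causal diagram above.
# All probabilities are made-up placeholders, purely to illustrate
# the structural point about shared causes; they are NOT estimates.
random.seed(0)
N = 1_000_000

nuclear_count = agi_count = joint_count = 0
for _ in range(N):
    # Climate stress raises the chance of major military conflict.
    climate_stress = random.random() < 0.5
    p_conflict = 0.3 if climate_stress else 0.1
    conflict = random.random() < p_conflict

    # Conflict is the shared parent: it raises both downstream risks.
    p_nuclear = 0.2 if conflict else 0.05
    p_agi = 0.2 if conflict else 0.05
    nuclear = random.random() < p_nuclear
    agi = random.random() < p_agi

    nuclear_count += nuclear
    agi_count += agi
    joint_count += nuclear and agi

p_nuclear_marginal = nuclear_count / N
p_agi_marginal = agi_count / N
print(f"P(nuclear)              ~ {p_nuclear_marginal:.4f}")
print(f"P(AGI)                  ~ {p_agi_marginal:.4f}")
print(f"P(both), simulated      ~ {joint_count / N:.4f}")
print(f"P(both), if independent ~ {p_nuclear_marginal * p_agi_marginal:.4f}")
```

With these placeholder numbers the simulated joint probability comes out around 0.010 versus roughly 0.0064 under the independence assumption; swapping in different numbers changes the magnitudes but not the direction of the gap, so long as the shared parent remains.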
That being said, this was helpful to me in thinking about the counterfactual case, and for some reason this is also the first time I have ever seen the idea of ‘cruxing on the wrong question’ pointed to, which is very interesting.