After-the-fact analysis of the causes of major disasters often reveals multiple independent causes, none of which would have caused a disaster by itself, but each of which degraded or disabled the usual safeguards in place for the other problems. This seems to come up in everything from relatively small-scale transportation disasters to the fall of civilizations, and possibly in major extinction events. E.g. there have been many large asteroid impacts, but the one which finished off the dinosaurs happened to also coincide with (and possibly triggered or exacerbated) major volcanic activity. (The Deccan Traps.)
So the worst possible outcome of the epidemic might be that it happens to coincide with some other, totally unrelated disaster. For example, natural disasters such as earthquake+tsunamis, widespread rainfall and flooding, major fires piling air-quality issues on top of COVID-19 breathing problems, and so on. (In a way, I’m thankful the recent fires in Australia happened then, and are therefore not happening now.) Unrelated war(s) would make everything worse. So would a second pandemic at the same time. So would just about anything on the list of possible existential risks. I think this would count as a worst-case outcome of the epidemic, even though it would be an indirect outcome.
The global scale of this epidemic, and its months-long projected duration, seem to make it more probable that something else will go badly wrong just when everything else is under stress.
Yes, and the biggest locust outbreak in 70 years is happening right now.
Also worth noting: even if the onsets of global catastrophes are random and independent, global catastrophes will tend to cluster together, so we might expect another global catastrophe before this one is over. (See the “clustering illusion.”)
I once looked into the clustering illusion and found research showing that in interconnected systems it is not an illusion: any correlation significantly increases the probability of clustering:
Downarowicz, T., & Lacroix, Y. (2011). The law of series. Ergodic Theory and Dynamical Systems, 31(2), 351–367.
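The point about clustering under independence can be checked with a quick simulation (a minimal sketch, not from the thread; the 10-year mean gap is an arbitrary assumption). If catastrophe onsets form a Poisson process, the gaps between them are exponentially distributed, and about 63% of gaps come out shorter than the average gap, so runs of closely spaced events look like "clusters" even with zero correlation:

```python
import random

random.seed(0)

# Simulate catastrophe onsets as a Poisson process: inter-arrival
# times are exponential with an (assumed) mean of 10 years.
mean_gap = 10.0
gaps = [random.expovariate(1.0 / mean_gap) for _ in range(100_000)]

# For an exponential distribution, P(gap < mean) = 1 - 1/e ≈ 0.63,
# so most gaps are shorter than the average gap: events appear
# clustered even though every onset is independent.
short = sum(g < mean_gap for g in gaps) / len(gaps)
print(f"fraction of gaps shorter than the mean: {short:.2f}")
```

The Downarowicz–Lacroix result then says that adding any positive correlation on top of this baseline pushes the clustering further still.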