A rough distribution (on a log scale) based on the two points you estimated for wars (95% < 1B people die in wars, 85% < 10M people die in wars) gives a median of ~2,600 people dying. Does that seem right?
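(For reference, a minimal sketch of how such a two-point fit could be reconstructed – this assumes a log-normal fitted to the two stated percentiles, whereas the original was only described as "a rough distribution on a log scale", so the exact median it produces may differ from ~2,600:)

```python
# Hypothetical reconstruction of the two-point fit mentioned above:
# fit a log-normal to P(deaths < 10M) = 0.85 and P(deaths < 1B) = 0.95,
# then read off its median. (The original fit was only described as
# "rough", so the exact method here is an assumption.)
import math

from scipy.stats import norm

q_lo, p_lo = 1e7, 0.85   # 85% chance fewer than 10M deaths
q_hi, p_hi = 1e9, 0.95   # 95% chance fewer than 1B deaths

# Treat log10(deaths) as Normal(mu, sigma) and solve the two quantile equations.
z_lo, z_hi = norm.ppf(p_lo), norm.ppf(p_hi)
sigma = (math.log10(q_hi) - math.log10(q_lo)) / (z_hi - z_lo)
mu = math.log10(q_hi) - z_hi * sigma

print(f"sigma = {sigma:.2f} (in log10 units), median = {10**mu:,.0f} deaths")
# Prints a median of a few thousand -- the same order of magnitude as the
# ~2,600 quoted above; the exact value depends on how the rough fit was done.
```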
No. My model is the sum of a bunch of random variables for possible conflicts (these variables are not independent of each other), where there are a few potential global wars that would cause millions or billions of deaths, and lots and lots of tiny wars each of which would add a few thousand deaths.
This model predicts a background rate set by the sum of the smaller conflicts, with large spikes whenever a larger conflict happens. Accordingly, over the last three decades (with the tragic exception of the Rwandan genocide) total war deaths per year (combatants + civilians) have been between 18k and 132k (wow, the Syrian Civil War has been way worse than the Iraq War, I didn’t realize that).
So my median is something like 1M people dying over the decade (roughly ten years at that background rate), because I view a major conflict as under 50% likely, and we could easily have a decade as peaceful (no, really) as the 2000s.
Yeah, this seems pretty reasonable. It’s actually stark looking at the Our World in Data numbers – those per-year totals seem really high. Do you have your model somewhere? I’d be interested to see it.
It’s not explicit. Like I said, the terms are highly dependent in reality, but for intuition you can think of a series of variables Xk for k from 1 to N, where Xk equals 1/k with probability 1/√N (and is 0 otherwise). And think of N as pretty large.
So most of the time, the sum of these is dominated by a lot of terms with small contributions. But every now and then, a big one hits and there’s a huge spike.
(I haven’t thought very much about what functions of k and N I’d actually use if I were making a principled model; 1/k and 1/√N are just there for illustrative purposes, such that the sum is expected to have many small terms most of the time and some very large terms occasionally.)
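(A minimal simulation of that toy model, treating the terms as independent as the illustration does – Xk equals 1/k with probability 1/√N and 0 otherwise:)

```python
# Simulate the illustrative model: X_k = 1/k with probability 1/sqrt(N),
# else 0, summed over k = 1..N, with the terms treated as independent
# (the real conflicts aren't, as noted above).
import numpy as np

rng = np.random.default_rng(0)
N = 10_000                       # "pretty large"
k = np.arange(1, N + 1)
p = 1 / np.sqrt(N)               # probability each potential conflict occurs

def one_draw() -> float:
    hits = rng.random(N) < p     # which potential conflicts happen this time
    return float(np.sum(hits / k))

totals = np.array([one_draw() for _ in range(10_000)])
print(f"median:          {np.median(totals):.3f}")
print(f"95th percentile: {np.percentile(totals, 95):.3f}")
print(f"max:             {totals.max():.3f}")
# Typical draws are a background of many small 1/k terms; the top few
# percent are dominated by a single large term (small k) firing, and the
# max sits an order of magnitude or more above the median -- the "spike"
# behaviour described above.
```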