Micromorts vs. Life Expectancy
I am trying to get a solid grip on how micromorts work (there is a quick intro for anyone not familiar with the concept at https://en.wikipedia.org/wiki/Micromort). I have been doing some calculations, and there is one result I haven't been able to resolve or explain.
I attempted to reconstruct the “22 micromorts per day” value for deaths from all causes in the U.S. in 2010 (as listed on the Wikipedia page), and was able to do that, so I updated it with 2019 data to get an estimate of 24 micromorts per day. The calculation: take the total number of U.S. deaths in 2019 (2,854,838) and divide by the U.S. population in 2019 (328,239,523), which estimates the probability that a randomly chosen U.S. resident dies within the year. Divide by the number of days in a year (365.25) to get a daily probability of death, then multiply by 1 million to get about 24 micromorts per day for deaths from all causes in the U.S. in 2019. So far, so good.
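For anyone who wants to check the arithmetic, here is a minimal Python sketch of the calculation above, using the same 2019 figures (the variable names are just mine):

```python
# All-cause micromorts per day, U.S. 2019, from the figures in the post.
deaths_2019 = 2_854_838        # total U.S. deaths in 2019
population_2019 = 328_239_523  # U.S. population in 2019
days_per_year = 365.25

p_death_per_year = deaths_2019 / population_2019    # ~0.0087
p_death_per_day = p_death_per_year / days_per_year  # ~2.38e-5
micromorts_per_day = p_death_per_day * 1_000_000    # 1 micromort = a 1-in-a-million chance of death

print(f"{micromorts_per_day:.1f} micromorts per day")  # -> 23.8
```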
Then I decided to look at the number a different way: if the risk of death is 24 micromorts per day, we should be able to convert that to an estimated lifespan, because on average we each get 1 million micromorts total (obviously some people get fewer and some get more, but by definition it has to average out). But if we take 1 million micromorts divided by 24 micromorts per day, we get about 41,667 days, or roughly 114 years. That is significantly longer than the CDC's 2018 estimate of U.S. life expectancy at birth, 78.7 years (although at least it is within an order of magnitude).
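And here is the reverse calculation, naively spending the 1-million-micromort budget at a constant 24 per day (again, just a sketch of the arithmetic above):

```python
# Implied lifespan if the daily risk stayed constant at 24 micromorts.
micromorts_per_day = 24
total_micromorts = 1_000_000   # everyone dies exactly once

implied_days = total_micromorts / micromorts_per_day  # ~41,667 days
implied_years = implied_days / 365.25                 # ~114 years

print(f"{implied_years:.0f} years")  # -> 114, well above the ~78.7-year life expectancy
```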
Can anyone explain why there is a difference between the two estimates for human lifespan? I feel like there is a simple explanation lurking somewhere, but I haven’t been able to figure out what it is yet. Thanks for your help!