Interesting. You could do similar calculations for things like asteroid-impact prevention; I wonder which would win out. It also gave me a sickly feeling that you could use the vast numbers to justify killing a few people to guarantee a safe singularity. In effect, that is what we do when we divert resources away from efficient charities we know work and toward singularity research.
We make these kinds of tradeoffs, for better or worse, all the time—and sometimes, as when we take money from the fire department or health care to put it into science or education, we are letting people die in the short term as part of a gamble that the long-term outcomes will justify it.
I recommend Anna Salamon’s presentation “How Much it Matters to Know What Matters: A Back-of-the-Envelope Calculation.” She did a good job of showing just how important existential risk research is.
I thought it would be nice to be able to plug in my own numbers for the calculation, so I quickly threw this together.
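For anyone who wants to try it, here is a minimal sketch of the kind of expected-value calculation the talk does. Every number below is an assumed placeholder of mine, not a figure from the presentation; swap in your own estimates.

```python
# Back-of-the-envelope expected value of existential-risk research.
# All parameter values are illustrative assumptions; plug in your own.

future_lives = 1e16          # lives at stake if humanity survives long term (assumption)
p_extinction = 0.2           # baseline probability of existential catastrophe (assumption)
fractional_risk_cut = 1e-10  # fraction of that risk removed per dollar of research (assumption)
dollars = 1000.0             # size of the donation being evaluated

# Reduction in extinction probability bought by the donation.
delta_p = p_extinction * fractional_risk_cut * dollars

# Expected future lives saved by that reduction.
expected_lives_saved = future_lives * delta_p

print(f"Reduction in extinction probability: {delta_p:.2e}")
print(f"Expected lives saved: {expected_lives_saved:,.0f}")
print(f"Expected lives saved per dollar: {expected_lives_saved / dollars:,.0f}")
```

The interesting thing is how insensitive the conclusion is to the exact inputs: with so many lives at stake, even cutting the risk-reduction estimate by several orders of magnitude still leaves an enormous expected value per dollar.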