I would like A Rationalist's Guide to Personal Catastrophic Risks.
We like to think a lot about Global Catastrophic Risks (especially the EA folks), but there are smaller problems that are just as devastating to the individual.
Should we wear helmets in cars? Should we wear covert body armor? Own a gun? Get a bug-out bag? An emergency cube? Learn wilderness survival?
And how much should we be concerned about those “survivalist” topics versus less obvious longevity steps like flossing our teeth? Not everyone’s risk profile is the same. How do we assess that?
How should we measure that? Dollars? QALYs? Micromorts? Should we use hyperbolic discounting? Do we expect to reach actuarial escape velocity (or be granted near-immortality after the Singularity) and how would that change the calculus?
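To make the question concrete, here is a minimal sketch (in Python, with entirely made-up placeholder numbers) of the kind of bookkeeping I have in mind: convert each intervention into an annual risk reduction and QALY gain, discount over a horizon, and divide by cost. The intervention names, the micromort and QALY figures, and the discount rates are all hypothetical illustrations, not estimates; the point is only that the choice of horizon and discount function (exponential vs. hyperbolic) can swing the comparison.

```python
# A minimal sketch of one way to frame the question, NOT a real risk model.
# All numbers are invented placeholders; only the comparison machinery matters:
# (annual risk/QALY gain) -> (discounted QALYs over a horizon) -> (cost per QALY).

from dataclasses import dataclass

@dataclass
class Intervention:
    name: str
    micromorts_averted_per_year: float  # hypothetical annual mortality-risk reduction (1 micromort = 1e-6 chance of death)
    qaly_gain_per_year: float           # hypothetical annual QALY gain (includes morbidity, not just mortality)
    annual_cost_dollars: float          # hypothetical annual cost

def exponential_discount(year: int, rate: float = 0.03) -> float:
    """Standard exponential discounting: weight = 1 / (1 + r)^t."""
    return 1.0 / (1.0 + rate) ** year

def hyperbolic_discount(year: int, k: float = 0.1) -> float:
    """Simple hyperbolic discounting: weight = 1 / (1 + k * t)."""
    return 1.0 / (1.0 + k * year)

def discounted_qalys(intervention: Intervention, horizon_years: int, discount) -> float:
    """Sum the intervention's yearly QALY gain over the horizon, applying the given discount function."""
    return sum(intervention.qaly_gain_per_year * discount(t) for t in range(horizon_years))

# Placeholder interventions -- figures are illustrative, not real estimates.
interventions = [
    Intervention("car helmet", micromorts_averted_per_year=10, qaly_gain_per_year=0.001, annual_cost_dollars=50),
    Intervention("daily flossing", micromorts_averted_per_year=2, qaly_gain_per_year=0.01, annual_cost_dollars=20),
]

HORIZON = 40  # years; a longer horizon changes the ranking

for intervention in interventions:
    exp_q = discounted_qalys(intervention, HORIZON, exponential_discount)
    hyp_q = discounted_qalys(intervention, HORIZON, hyperbolic_discount)
    total_cost = intervention.annual_cost_dollars * HORIZON
    print(f"{intervention.name:>14}: "
          f"{exp_q:.2f} QALYs (exp. discount), "
          f"{hyp_q:.2f} QALYs (hyperbolic), "
          f"${total_cost / exp_q:,.0f} per QALY")
```

Note how sensitive this is to the assumptions: if you expect actuarial escape velocity (or post-Singularity near-immortality), the effective horizon stretches out and small persistent risk reductions become worth far more than the discounted sums above suggest.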
Do anthropic effects matter to subjective survival? In the multiverse?
Consider also catastrophes that don’t kill you, like losing a limb or going blind, and more social risks like identity theft, getting scammed, robbed, or sued.