For now, the question I'm left with is: what OTHER existential risks are out there, how cost-effective are they to fix, and do we have an adequate metric to judge our success?
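One crude candidate metric, as a minimal sketch: expected lives saved per dollar, i.e. (reduction in extinction probability) × (lives at stake) / cost. The intervention names and every number below are made-up placeholders for illustration, not estimates from the book or anywhere else:

```python
# Sketch of a cost-effectiveness metric for existential-risk mitigation:
# expected lives saved per dollar = (risk reduction) * (lives at stake) / cost.
# All figures are hypothetical placeholders, not real estimates.

WORLD_POPULATION = 8e9  # lives at stake in a full extinction event

def lives_saved_per_dollar(prob_reduction: float, cost_usd: float,
                           lives_at_stake: float = WORLD_POPULATION) -> float:
    """Expected lives saved per dollar spent on a mitigation effort."""
    return prob_reduction * lives_at_stake / cost_usd

# Hypothetical interventions: (name, absolute reduction in extinction
# probability this century, total cost in USD).
interventions = [
    ("asteroid detection survey", 1e-6, 1e9),
    ("pandemic early warning",    1e-4, 1e10),
]

# Rank interventions from most to least cost-effective under this metric.
for name, dp, cost in sorted(interventions,
                             key=lambda t: -lives_saved_per_dollar(t[1], t[2])):
    print(f"{name}: {lives_saved_per_dollar(dp, cost):.2e} lives/$")
```

Even this toy version makes the difficulty visible: the answer is dominated by probability estimates we can barely pin down to within orders of magnitude.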
The edited volume Global Catastrophic Risks addresses this question. It’s far more extensive than Nick Bostrom’s initial Existential Risks paper and provides a list of further reading after each chapter.
Here are some of the covered risks:
Astro-physical processes such as the stellar lifecycle
Human evolution
Super-volcanism
Comets and asteroids
Supernovae, gamma-ray bursts, solar flares, and cosmic rays
Climate change
Plagues and pandemics
Artificial Intelligence
Physics disasters
Social collapse
Nuclear war
Biotechnology
Nanotechnology
Totalitarianism
The book also has many chapters discussing the analysis of risk, risks and insurance, prophecies of doom in popular narratives, cognitive biases relating to risk, selection effects, and public policy.
What are physics disasters?
Breakdown of the vacuum state, conversion of matter into strangelets, mini black holes, and other things that people fear from a particle accelerator like the LHC. It boils down to, “Physics is weird, and we might find some way of killing ourselves by messing with it.”
What is the risk from Human Evolution? Maybe I should just buy the book...
It’s well-written, though depressing, if you take “only black holes will remain in 10^45 years” as depressing news.
Evolution is not a forward-looking algorithm, so humans could evolve in dangerous, retrograde ways, losing what we currently consider valuable about ourselves, or even driving the species itself extinct should it become too dependent on current conditions.