One central problem is that people are constantly deluged with information about incipient crises. The Typical Person cannot be expected to understand the difference in risk levels between UFAI, bioterror, thermonuclear war, and global warming, and this is no disparagement of the Typical Person: these risks are effectively impossible to estimate.
But how can we deal with this multitude of potential disasters? Each disaster has some low probability of occurring, but because there are so many of them (swine flu, nuclear EMP attacks, grey goo, complexity…) we are almost certainly doomed unless we do something clever. Even if we take preventative measures sufficient to eliminate one risk (presumably at enormous expense), we will just get smashed by the next one on the list.
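The compounding effect above can be made concrete with a little probability: if n independent risks each have a small probability p of occurring, the chance that at least one occurs is 1 − (1 − p)^n, which climbs quickly with n. A minimal sketch, where the function name and the per-risk numbers are purely illustrative assumptions, not estimates of any real risk:

```python
def p_any(per_risk_probs):
    """Probability that at least one of several independent risks occurs.

    Computed as 1 minus the probability that none occur.
    """
    p_none = 1.0
    for p in per_risk_probs:
        p_none *= (1.0 - p)
    return 1.0 - p_none

# Ten hypothetical risks, each with a made-up 2% chance per century:
risks = [0.02] * 10
print(round(p_any(risks), 3))  # → 0.183
```

So ten individually modest 2% risks already add up to roughly an 18% chance of disaster, which is the intuition behind looking for strategies that address many risks at once rather than paying to eliminate them one at a time.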
Meta-strategy: find strategies that help defend against all sources of existential risk simultaneously. Candidates:
moon base
genetic engineering of humans to be smarter and more disease-resistant
generic civilizational upgrades, e.g. reducing traffic and improving the economy
simplification. There is no fundamental reason why complexity must always increase; almost everything can be simplified: the law, the economy, software.