How would you classify existential risks within this framework? (or would you?)
Here’s my attempt. Any corrections or additions would be appreciated.
Transparent risks: asteroids (we roughly know their frequency?)
Opaque risks: geomagnetic storms (we don’t know how resistant the electric grid is, although we have an idea of their frequency), natural physics disasters (such as vacuum decay), being killed by an extraterrestrial civilization (could also fit black swans or adversarial environments, depending on its nature)
Knightian risks:
- Black swans: ASI, nanotech, bioengineered pandemics, simulation shutdown (assuming it’s because of something we did)
- Dynamic environment: “dysgenic” pressures (maybe also adversarial), natural pandemics (the world is getting more connected, medicine more robust, etc., which makes it difficult to tell how the risks of natural pandemics are changing), nuclear holocaust (the game-theoretic equilibrium changes as we get nuclear weapons that are faster and more precise, better detectors, etc.)
- Adversarial environments: resource depletion or ecological destruction, misguided world government or another static social equilibrium that stops technological progress, repressive totalitarian global regime, takeover by a transcending upload (?), our potential or even our core values being eroded by evolutionary development (e.g., a Hansonian em world)
Other (?): technological arrest (“The sheer technological difficulties in making the transition to the posthuman world might turn out to be so great that we never get there.” from https://nickbostrom.com/existential/risks.html)
This is great! I agree with most of these, and think it’s a useful exercise to do this classification.