Meta: I honestly didn’t read the plan in full the first two times I posted. Instead I went to Wikipedia and looked up global catastrophic risk. Then, once I understood the definition of global catastrophic risk, I thought up solutions (“How would I best solve X?”) and checked whether they were on the map.
I share this because the first several things I thought of were not on the map. It also seems like several of the other answers are limited to “what’s outside the box” (“think outside the box” is a silly concept, because it often amounts to someone telling you exactly where the box is and where to think outside of it) and are anchored by things already near the existing map. I am not sure you are getting great improvements to the map from the way you have posed the problem.
New idea: If I were hosting the map, there would be a selection of known x-risk problems. Something like:
AI:
paperclippers
UFAI
oppressive AI (degrades our quality of life)
trickster AI (an AI built with a limit, e.g. human happiness, that redefines its own reference terms for “human” and “happiness” and kills all existing humans who are not happy)
Nanotechnology:
second-generation molecular assemblers that escape containment (deliberately or by accident)
a race to profit that becomes a game of chicken, with competitors taking on the highest risks
Biotechnology:
a new disease with no known relationship to existing diseases and high virulence (difficult to cure)
a new strain of an old disease (known effects, and a race to fight it off)
Nuclear:
catastrophic death of all life through nuclear war and ongoing radiation
reduced lifespans due to radiation-induced cancer (possibly pushing us back to a pre-colonial level of civilisation)
worryingly fast mutation due to radiation (either in humans or in things that harm us or ensure our wellbeing, e.g. viruses, food supplies)
Global climate:
the planet becomes uninhabitable for humans
the planet becomes less habitable for humans, slowing the growth of science and technology
humans are forced underground, limiting the progress of scientific research and our ability to sustain food production
humans are cut off from each other and forced to live in small colonies
The map would also show which of these problems each solution on the solution map might help solve; a rough sketch of how that cross-referencing could be represented is below.
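To make the proposal concrete, here is a minimal sketch in Python of the risk-to-solution mapping. All of the risk and solution names here are hypothetical placeholders for illustration, not the actual contents of anyone’s map; the point is only that once both lists exist as data, risks that no solution addresses fall out automatically.

```python
# Toy encoding of the proposed structure: a taxonomy of risks, plus a
# solution map where each solution lists the risks it might help with.
# All names below are hypothetical examples, not real map contents.

RISKS = {
    "AI": ["paperclipper", "UFAI", "oppressive AI", "trickster AI"],
    "Nanotechnology": ["escaped molecular assembler", "risk-taking race to profit"],
    "Biotechnology": ["novel high-virulence disease", "new strain of old disease"],
    "Nuclear": ["all-life-ending war", "radiation-induced cancer",
                "accelerated mutation"],
    "Global climate": ["uninhabitable planet", "reduced habitability",
                       "forced underground", "isolated colonies"],
}

SOLUTIONS = {
    "AI boxing": ["paperclipper", "UFAI", "trickster AI"],
    "underground refuges": ["all-life-ending war", "uninhabitable planet"],
    "biosurveillance": ["novel high-virulence disease", "new strain of old disease"],
}

def risks_without_solutions():
    """Return the risks on the map that no listed solution addresses."""
    covered = {risk for risks in SOLUTIONS.values() for risk in risks}
    all_risks = {risk for risks in RISKS.values() for risk in risks}
    return sorted(all_risks - covered)

if __name__ == "__main__":
    print("Uncovered risks:", risks_without_solutions())
```

Running this prints the risks left uncovered by the example solutions (e.g. “oppressive AI”, “accelerated mutation”), which is exactly the gap-finding exercise the map is meant to support.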
Edits: formatting. I still can’t get the hang of formatting after this long!
Edit: it looks like you are working on other maps at http://immortality-roadmap.com/.
Yes, the site is not finished, but the map “Typology of human extinction risks” is ready and will be published next week; around 100 risks will be listed. Any roadmap has its limitations because of its size and its basic 2D structure. Of course we could and should cover all options for all risks, but that should be done in more detail. Maybe I should make a separate map in which, for each risk, ways of preventing it are suggested.
I didn’t really know which x-risks you were talking about, which is why a map of x-risks would have helped me.
Basically the same risks you listed here. I can PM you the map.