This is a map of possible risks, not a map of claims. All it says is that if pure fusion weapons (or other simple nukes) are ever created, they will make the proliferation situation much more difficult. For example, laser enrichment is much simpler than traditional enrichment, and it was recognised as a proliferation risk. We can’t say exactly how, but technological progress is making nukes cheaper and simpler, and that is a problem. https://en.wikipedia.org/wiki/Separation_of_isotopes_by_laser_excitation
It can kill anyone, but not everyone. The world has around 5 million villages and small towns, and you would need at least one bomb for each of them. At the peak of the Cold War the world had fewer than 100,000 bombs. If you really wanted to kill everyone, you would have to try something special, like an artificial nuclear winter or summer.
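The arithmetic behind that claim can be checked in a couple of lines (using the comment's own rough figures, which are order-of-magnitude estimates, not precise counts):

```python
# Back-of-the-envelope check: could the peak Cold War arsenal cover
# one bomb per settlement? Both numbers are the rough estimates from
# the comment above, not authoritative figures.
settlements = 5_000_000   # approximate villages and small towns worldwide
peak_arsenal = 100_000    # upper bound on warheads at the Cold War peak

coverage = peak_arsenal / settlements
print(f"Arsenal covers about {coverage:.0%} of settlements")  # → about 2%
```

So even the largest historical arsenal falls short by a factor of roughly fifty, which is why direct bombing alone cannot reach everyone.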
I realize that it’s a map of risks, I’m just saying the possibilities don’t even remotely fall into comparable levels of risk. “Death from nuclear ICBM” is quite imaginable and possible. Not only that, there was a time when it almost seemed imminent and inevitable. And it could easily become that way again. Whereas “death from cold fusion” is essentially of zero meaningful concern.
Maybe it would be useful if you could attach some kind of crude probabilities to your estimates. I can fill a PDF with items like “death from massive leprechaun attack”, but it wouldn’t be a very useful guide.
While I do not appreciate your wording “death from cold fusion” when we are speaking about proliferation risks connected with new technologies, I have already added some kind of probability estimation to the map and painted the boxes in one of three colors. But instead of probability I used “importance of risks”, which is more clearly connected with what we should do to prevent them.
“Importance (or urgency) of risks is subjectively estimated based on their probability, timing, magnitude of expected effect, and the scientific basis for the risk. Importance here means how much attention and effort we should put into controlling the risk.
Green – just keep it in mind, do nothing
Yellow – pay attention, do reasonable efforts to prevent
Red – pay immediate attention to prevent”
The PDF is here: http://immortality-roadmap.com/nukerisk2.pdf
In it, only two risks are red: nuclear war and nuclear-biological war.
The risk of large-scale proliferation connected with new technologies is yellow,
and the risk of Jupiter detonation is green.