What we could learn from the frequency of near-misses in the field of global risks (Happy Bassett-Bordne day!)

I wrote an article on how we could use near-miss data to estimate the cumulative probability of nuclear war up to now.
TL;DR: from other domains we know that the frequency of close calls to actual events is around 100:1. If we apply this ratio to nuclear war and assume that there were many more near-misses than we know of, we can conclude that the probability of nuclear war was very high, and that we live in an improbable world where it did not happen.
Yesterday, 27 October, was Arkhipov Day, in memory of the man who prevented nuclear war. Today, 28 October, is Bordne and Bassett Day, in memory of the Americans who prevented another near-war event. Bassett was the man who did most of the work of preventing a launch based on a false attack code, and Bordne made the story public.
The history of the Cold War shows us that there were many occasions when the world stood on the brink of disaster, the most famous of them being the cases of Petrov, Arkhipov, and the recently revealed Bordne case in Okinawa.
I know of more than ten, but fewer than a hundred, similar cases of varying degrees of reliability. Other global catastrophic risk near-misses are not nuclear but biological, such as the Ebola epidemic, swine flu, bird flu, AIDS, oncoviruses, and the SV-40 contamination of vaccines.
The pertinent question is whether we have survived as a result of observational selection, or whether these cases are not statistically significant.
In the Cold War era these types of situations were quite numerous (the Cuban missile crisis, for example). However, in each case it is difficult to say whether the near-miss was actually dangerous. In some cases the probability of disaster was subjective: according to participants it was large, whereas objectively it was small. Other near-misses could pose a real danger yet go unnoticed by the operators.
We can define a near-miss of the first type as a case that meets both of the following criteria:
a) safety rules have been violated
b) emergency measures were applied in order to avoid disaster (e.g. the emergency braking of a vehicle, or a refusal to launch nuclear missiles).
A near-miss can also be defined as an event which, according to some of its participants, was very dangerous; or as an event during which a number of the factors (but not all) of a possible catastrophe coincided.
Another type of near-miss is the miraculous salvation. This is a situation in which a disaster was averted by a miracle, that is, it should have happened, but it did not because of a happy coincidence of newly emerged circumstances (for example, a bullet stuck in the gun barrel). Obviously, in the case of a miraculous salvation the chance of catastrophe was much higher than in near-misses of the first type, on which we will now focus.
We may take near-miss statistics from other areas where a known ratio between near-misses and actual events exists; for example, we can compare the statistics of near-misses and actual accidents with victims in transport.
Industrial research suggests that, across different areas, one crash accounts for 50-100 near-miss cases and around 10,000 human errors or violations of regulations ("Gains from Getting Near Misses Reported").
Other surveys estimate the ratio at 1 to 600, at 1 to 300, and even at 1 to 3000 (the last in the case of unplanned maintenance).
The spread of estimates, from 100 to 3000, reflects the fact that different industries, and different criteria for what counts as a near-miss, are being considered.
However, the typical ratio of near-misses to actual events is in the hundreds, so we cannot conclude that the observed non-occurrence of nuclear war results from observational selection.
On the other hand, we can use near-miss frequency to estimate the risk of a global catastrophe. We will use the lower estimate of 1 in 100 for the ratio of actual events to near-misses, because the type of phenomena with the highest near-miss rate will dominate the probability landscape. (For example, if an epidemic becomes catastrophic in 1 out of 1000 near-miss cases, while for nuclear disasters the ratio is 1 in 100, nuclear near-misses will dominate the total risk.)
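To make the arithmetic concrete, here is a minimal sketch in Python. It assumes, purely for illustration, that the near-misses are independent and that each one carried a 1-in-100 chance of turning into the real event; the counts of 10, 40, and 100 near-misses are illustrative guesses, not documented figures.

```python
# Minimal sketch: the cumulative probability of catastrophe implied by a
# count of near-misses, under the (assumed) independence of the cases and
# the (assumed) 1-in-100 chance that any single near-miss escalates.

def cumulative_probability(near_misses: int, ratio: int = 100) -> float:
    """P(at least one event) = 1 - (1 - 1/ratio)^near_misses."""
    return 1.0 - (1.0 - 1.0 / ratio) ** near_misses

# "Several dozen" Cold War near-misses; these counts are illustrative.
for n in (10, 40, 100):
    print(f"{n:>3} near-misses -> P(war so far) ~ {cumulative_probability(n):.0%}")
```

Under these assumptions, forty known near-misses imply a cumulative probability of war of about one third; only with several hundred near-misses does the implied probability approach certainty, which is why the known record alone does not force an anthropic explanation.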
During the Cold War there were several dozen near-misses, along with several near-miss epidemics; this indicates that at the current level of technology we have about one such case a year, or perhaps more. If we follow the press, several times a year some situation arises that could lead to a global catastrophe: a threat of war between North and South Korea, an epidemic, the close passage of an asteroid, a global crisis. Many near-misses also remain classified.
If the average level of safety with regard to global risks does not improve, the frequency of such cases suggests that a global catastrophe could happen in the next 50-100 years, which coincides with estimates obtained by other means.
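As a rough check of that timeline, here is a short sketch under the same assumptions: roughly one global-risk near-miss per year, combined with the 1:100 ratio above, gives an assumed annual catastrophe probability of about 1%.

```python
import math

# Sketch of the 50-100 year claim: ~1 near-miss per year times an
# (assumed) 1-in-100 chance of escalation gives ~1% risk per year.
annual_p = 1.0 / 100

for years in (50, 100):
    p = 1.0 - (1.0 - annual_p) ** years
    print(f"P(catastrophe within {years} years) ~ {p:.0%}")

# Median waiting time: solve (1 - annual_p)^t = 0.5 for t.
median = math.log(0.5) / math.log(1.0 - annual_p)
print(f"Median waiting time ~ {median:.0f} years")  # ~69 years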
It is important to increase detailed reporting of such cases in the field of global risks, and to learn how to draw useful conclusions from them. In addition, we need to reduce the rate of near-misses in the areas of global risk by rationally and responsibly raising the overall level of safety measures.