Ok, on a gears level I have the following model: imagine that Moore’s law holds and that AI risk per unit time is proportional to the amount of available compute, via some unknown coefficient. In that model, once cumulative risk reaches 50 per cent, 99 per cent is very near.
But we need some normalisation to keep probabilities from exceeding one, for example by working with odds ratios.
Also, cumulative risk grows exponentially even for a constant probability density. If the probability density itself grows exponentially, cumulative risk is double exponential.
It all means that if AI risk reaches some sizeable value, say 10 per cent or 50 per cent, there are only a few years until the end is almost certain. In other words, there is little difference between the two claims “10 per cent AI risk by 2040” and “90 per cent AI risk by 2040”, as both mean that the end will come in the 2040s.
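The compression of the 50-to-99 per cent gap can be made concrete with a small sketch. The numbers here are illustrative assumptions, not claims: a 1 per cent initial annual hazard that doubles every two years (a Moore’s-law-style growth rate). Under an exponentially growing hazard h(t) = h0·exp(g·t), survival to time t is exp(−(h0/g)·(exp(g·t) − 1)), which is the “double exponential” shape mentioned above.

```python
import math

# Illustrative assumptions (not claims about real AI risk):
g = math.log(2) / 2.0   # hazard doubling time of 2 years
h0 = 0.01               # assumed initial annual hazard of 1%

def cumulative_risk(t):
    # Complement of survival under hazard h(t) = h0 * exp(g * t):
    # risk(t) = 1 - exp(-(h0/g) * (exp(g*t) - 1))
    return 1.0 - math.exp(-(h0 / g) * (math.exp(g * t) - 1.0))

def years_until(p, step=0.01):
    # Smallest t (to within `step` years) with cumulative risk >= p.
    t = 0.0
    while cumulative_risk(t) < p:
        t += step
    return t

t50 = years_until(0.50)
t99 = years_until(0.99)
print(f"50% reached after {t50:.1f} years, 99% after {t99:.1f} years, "
      f"gap = {t99 - t50:.1f} years")
```

With these particular numbers the gap between 50 per cent and 99 per cent cumulative risk comes out to only about five years, even though reaching 50 per cent took nearly a decade; the qualitative point does not depend on the exact parameter choices.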