What do you mean by “probability is growing exponentially”? Cumulative probabilities cannot grow exponentially forever, since they are bounded by [0,1].
It seems salvageable if what you mean by “probability” is (dP/dt)/P, where P is cumulative probability.
Can you clarify what gears-level model leads to exponential growth in “probability”?
Even if you pick a gears-level model of AGI probability which resembles an exponential within the domain [0, 0.99], you should have a lot of uncertainty over the growth rate. Thus your model should be a mixture of exponentials with a fairly wide spread over hyperparameters, and it is highly unlikely that you would get 99% by 2040, given how little we know about AGI.
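To make the mixture point concrete, here is a toy sketch (all numbers are my own illustrative assumptions, not estimates from anyone in this thread): suppose cumulative probability follows P(t) = P0·exp(r·(t − t0)), taken seriously only up to 99%, with P0 = 5% in 2023 and genuine uncertainty over the growth rate r.

```python
import math

# Toy model (illustrative numbers only): cumulative probability
# P(t) = P0 * exp(r * (t - t0)), trusted only while P(t) <= 99%.
t0, P0, target = 2023, 0.05, 0.99

for r in (0.05, 0.10, 0.20, 0.30):          # uncertain growth rate per year
    year = t0 + math.log(target / P0) / r   # when P(t) first hits 99%
    print(f"r = {r:.2f}/yr -> 99% around {year:.0f}")
```

With r = 0.30 this gives roughly 2033, but with r = 0.05 it gives roughly 2083. A mixture over rates spanning that range cannot concentrate 99% by 2040.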
Ok, on the gears level I have the following model: imagine that Moore’s law holds and AI risk per year is proportional to the amount of available compute, via some unknown coefficient. In that model, once cumulative risk reaches 50 per cent, 99 per cent is very near.
But we need some normalisation to avoid probabilities above one, for example by working with odds ratios instead.
Also, cumulative risk approaches one exponentially even for a constant probability density: a constant hazard rate λ gives cumulative risk 1 − exp(−λt). If the probability density itself grows exponentially, cumulative risk approaches one double-exponentially.
It all means that if AI risk reaches some sizeable figure, say 10 per cent or 50 per cent, there are only a few years until the end is almost certain. In other words, there is little difference between the two claims “10 per cent chance of AGI by 2040” and “90 per cent chance of AGI by 2040”, as both imply the end will come in the 2040s.
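Here is a toy numerical version of that model (the coefficient h0 and the 2-year doubling time are assumptions picked for illustration): take the hazard rate, i.e. instantaneous risk per year, to be h(t) = h0 · 2^(t/2), proportional to compute under Moore’s law, and compute how long it takes cumulative risk to climb from 50% to 99%.

```python
import math

# Toy model (h0 and doubling time are illustrative assumptions):
# hazard rate h(t) = h0 * 2**(t / doubling), i.e. instantaneous
# risk per year proportional to compute under Moore's law.
h0, doubling = 0.01, 2.0

def cumulative_risk(t):
    # F(t) = 1 - exp(-integral_0^t h(s) ds), integral in closed form
    integral = h0 * doubling / math.log(2) * (2 ** (t / doubling) - 1)
    return 1 - math.exp(-integral)

def years_until(p, step=0.1):
    # first year (to 0.1 precision) at which cumulative risk reaches p
    t = 0.0
    while cumulative_risk(t) < p:
        t += step
    return t

t50, t99 = years_until(0.5), years_until(0.99)
print(f"50% at year {t50:.1f}, 99% at year {t99:.1f}, gap = {t99 - t50:.1f}")
```

The gap comes out around five to six years and barely depends on the choice of h0, which is the sense in which “10 per cent by 2040” and “90 per cent by 2040” end up pointing at the same decade.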