> As far as I can tell, this possibility of an exponentially-paced intelligence explosion is the main argument for folks devoting time to worrying about super-intelligent AI now, even though current technology doesn’t give us anything even close. So in the rest of this post, I want to push a little bit on the claim that the feedback loop induced by a self-improving AI would lead to exponential growth, and see what assumptions underlie it.
I think few AI safety advocates believe this. It’s much more common to expect growth to be faster than exponential. As you point out, exponential growth is a knife-edge phenomenon.
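To make the knife edge explicit with a toy model (the power-law form and the constants here are my own illustration, not anything the original post commits to): suppose capability $x$ grows as

$$\frac{dx}{dt} = k x^{p}, \qquad x(0) = x_0 > 0,\quad k > 0.$$

Then $p = 1$ gives exactly exponential growth, $x(t) = x_0 e^{kt}$; any $p < 1$ gives only polynomial growth, $x(t) \propto t^{1/(1-p)}$ for large $t$; and any $p > 1$ blows up in finite time, $x(t) \to \infty$ as $t \to t^{*} = x_0^{1-p}/\big(k(p-1)\big)$. Exponential growth is the single boundary case $p = 1$: nudge the exponent either way and you land in a qualitatively different regime, which is the sense in which it sits on a knife edge.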
> As far as I can tell, very few people actually think that intelligence growth would exhibit an actual mathematical singularity
This is actually a pretty common view—not a literal singularity, but rapid technological acceleration until natural resource limitations (e.g. on total available solar energy and raw minerals) start binding. If you look at the history of technological progress, it looks a whole lot more like a hyperbola than like an exponential curve, so the hyperbolic growth forecast isn’t so insane. It’s the person arguing that growth rates are going to stop at 3% who is arguing against the bulk of historical precedent (and whose predecessors would have been wrong if they’d expected growth to stop at 0.3% or 0.03% or 0.003%...).
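To put the exponential-vs-hyperbola contrast in symbols (my own arithmetic, not a claim from the thread): a hyperbola

$$x(t) = \frac{C}{t^{*} - t}$$

satisfies $\dot{x} = x^{2}/C$, so its proportional growth rate is $\dot{x}/x = 1/(t^{*} - t)$, which keeps rising as $t$ approaches $t^{*}$; equivalently, the doubling time keeps shrinking. An exponential has a constant growth rate and a constant doubling time, so a historical record of successively shorter doubling times is better described by the hyperbola.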
> this seems instead to be a metaphor for exponential growth.
I think “singularity” usually either follows Vinge’s use (as the point beyond which you can’t predict what will happen, because the future is guided by actors smarter than you are) or as a reference to the dynamic that would produce a mathematical singularity if left unchecked.
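A quick numerical sketch of that second sense, the dynamic that would produce a singularity if left unchecked (the growth law, the resource ceiling, and every constant below are hypothetical, chosen only for illustration): the hyperbolic dynamic blows up in finite time on its own, but saturates once a resource limit starts binding.

```python
# Toy comparison (illustrative constants, not a model of real AI progress):
# "unchecked" hyperbolic growth dx/dt = k*x^2 blows up in finite time
# (analytically at t* = 1/(k*x0) = 1.0 here), while the same dynamic with a
# logistic-style resource ceiling x_max saturates instead.
K, X_MAX, DT, STEPS = 1.0, 1e3, 1e-4, 20_000

def simulate(capped: bool) -> list[float]:
    x, history = 1.0, []
    for _ in range(STEPS):
        rate = K * x * x
        if capped:
            rate *= max(0.0, 1.0 - x / X_MAX)  # growth stalls as the ceiling binds
        x += rate * DT  # forward-Euler step
        history.append(x)
        if x > 1e12:  # treat runaway values as "blew up"
            break
    return history

unchecked = simulate(capped=False)
capped = simulate(capped=True)
print(f"unchecked: exceeded 1e12 after {len(unchecked)} steps (t*≈1.0 predicted)")
print(f"capped:    settled at x≈{capped[-1]:.0f} of ceiling {X_MAX:.0f} after {len(capped)} steps")
```

The cap is there only to echo the resource-limits story above: on this reading, “singularity” names the underlying dynamic, not a literal infinity that anyone expects to be realized.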