I’ve read the paper, and while it mentions “intelligence explosion” a few times, they seem to be keeping that terminology taboo when it comes to the meat of the argument, which is what I think you were asking for.
Most of the material is phrased in terms of whether AIs will exhibit significantly more intelligence than human-based systems and whether human values will be preserved.
I think most people use “intelligence explosion” to mean something more specific than just exponential growth. But you’re right that we should try to learn what we can about how systems evolve by looking at the past.
> I’ve read the paper, and while it mentions “intelligence explosion” a few times, they seem to be keeping that terminology taboo when it comes to the meat of the argument, which is what I think you were asking for.
Yes, this is only a cosmetic issue with the paper, really.
> I think most people use “intelligence explosion” to mean something more specific than just exponential growth.
Sure: explosions also have to wind up happening rapidly to qualify as such.
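To make that distinction concrete with a toy model (my own illustration; not something the paper or the earlier comments commit to): plain exponential growth has a constant doubling time, whereas a simple “explosive” alternative is hyperbolic growth, which diverges at a finite time.

\[
\dot{x} = r x \;\Rightarrow\; x(t) = x_0 e^{r t}
\qquad \text{vs.} \qquad
\dot{x} = k x^2 \;\Rightarrow\; x(t) = \frac{x_0}{1 - k x_0 t},
\]

where the hyperbolic solution blows up as $t \to 1/(k x_0)$, so its doubling times shrink toward zero rather than staying fixed.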