It’s possible, but I think it would require a modified version of the “low ceiling conjecture” to be true.
The standard “low ceiling conjecture” says that human-level intelligence is the hard (or soft) limit, and therefore it will be impossible (or would take a very long period of time) to move from human-level AI to superintelligence. I think most of us tend not to believe that.
A modified version would keep the hard (or soft) limit but raise it slightly, so that a rapid transition to superintelligence is possible, yet the resulting superintelligence can't rapidly run away in capabilities (no near-term "intelligence explosion"). If one believes this modified version of the "low ceiling conjecture", then subsequent AIs produced by humanity might indeed be relevant.