As I will probably reiterate for the thousandth time in these discussions, the point at which anyone expected things to start happening quickly and discontinuously is when AGI becomes competent enough to do AI R&D and perform recursive self-improvement.
AI is currently doing AI R&D.
Yeah, I agree that we are seeing a tiny bit of that happening.
Commenting a bit on the exact links you shared: the AlphaChip stuff seems overstated from what I’ve heard from other people working in the space, “code being written by AI” is not a great proxy for AI doing AI R&D, and generating synthetic training data is a pretty narrow edge case of AI R&D (though yeah, it does matter, and it’s a substantial part of why I don’t expect a training data bottleneck, contrary to what many people have been forecasting).
I have a hard time imagining there’s a magical threshold where we go from “AI is automating 99.99% of my work” to “AI is automating 100% of my work” and things suddenly go Foom (unless it’s for some other reason, like “the AI built a nanobot swarm and turned the planet into computronium”). As it is, I would guess we are closer to “AI is automating 20% of my work” than to “AI is automating 1% of my work”.
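To put rough numbers on that, here’s a toy Amdahl’s-law-style sketch (my own illustration, not something from this thread; `bottleneck_speedup` is a made-up helper): if a fraction f of the work is automated and that part becomes effectively free, the remaining human share caps the overall speedup at 1/(1 - f), which grows smoothly with f and is already enormous well before f hits 1.

```python
# Toy model (my own illustration): treat the automated share of work as
# effectively free, so the remaining human share is the bottleneck and the
# overall speedup is 1 / (1 - f), Amdahl's-law style.
def bottleneck_speedup(automated_fraction: float) -> float:
    if automated_fraction >= 1.0:
        return float("inf")  # only a literal 100% removes the human bottleneck entirely
    return 1.0 / (1.0 - automated_fraction)

for f in (0.01, 0.20, 0.99, 0.9999):
    print(f"{f:.2%} automated -> {bottleneck_speedup(f):,.2f}x overall speedup")
# 1.00% -> 1.01x, 20.00% -> 1.25x, 99.00% -> 100.00x, 99.99% -> 10,000.00x
```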
It’s of course all a matter of degree. The concrete prediction Paul made was “doubling in 4 years before we see a doubling in 1 year”. I would currently be surprised (though not very surprised) if we see the world economy double at all before we get much faster growth (probably by taking humans out of the loop completely).
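For a sense of scale, here’s a quick back-of-the-envelope (my own arithmetic, assuming constant exponential growth, which Paul’s framing doesn’t require) on the annual growth rates those two doubling times imply:

```python
# Back-of-the-envelope: annual growth rate implied by a given doubling time,
# assuming constant exponential growth (a simplifying assumption of mine).
def implied_annual_growth(doubling_time_years: float) -> float:
    return 2 ** (1 / doubling_time_years) - 1

print(f"Doubling in 4 years ~= {implied_annual_growth(4):.1%} annual growth")  # ~18.9%
print(f"Doubling in 1 year  ~= {implied_annual_growth(1):.1%} annual growth")  # 100.0%
```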