This narrative (about timing) promotes building $150bn training systems in 2026-2027. AGI is nigh, so it makes sense to build them; and if they aren't getting built, that might itself be why AGI hasn't arrived yet, so build them already (the narrative implies).
Actual knowledge that this last step of scaling is just barely sufficient to matter seems unlikely. But this step does appear to go beyond what would happen by default, so a final push might be needed to get it done, and the step after it won't be achievable on mere narrative. While funding keeps scaling, the probability of triggering an intelligence explosion is elevated; once funding stops scaling, the per-year probability goes down (assuming intelligence hasn't exploded by then). In this sense the narrative has a point.