Consider a recursive search for an AGI architecture: once a suitable test bench exists (“a model that does well on this bench is an AGI”), someone could automate searching the space of possible architectures.
Most published AI papers reuse techniques from a finite set, and modern ones often combine several to reach state-of-the-art results. So if you built a composable library of software modules that could apply any technique from any paper, adjusting for tensor shapes, data types, quantization, and so on, combinations of modules from that library would cover every known technique, plus a very large number of combinations no one has yet tried.
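A minimal sketch of what such a composable library might look like. Everything here is invented for illustration: the module names stand in for published techniques, and the "tensors" are just lists of floats; a real library would handle shape and dtype adaptation between modules.

```python
# Hypothetical sketch: each "module" wraps one published technique as a
# composable transform. Names and behaviors are stand-ins, not real
# implementations; real modules would adapt shapes, dtypes, quantization.
def attention(x):
    return [v * 1.1 for v in x]   # placeholder for an attention technique

def moe(x):
    return [v + 0.5 for v in x]   # placeholder for mixture-of-experts

def quantize(x):
    return [round(v, 1) for v in x]   # placeholder for quantization

LIBRARY = {"attention": attention, "moe": moe, "quantize": quantize}

def compose(names):
    """Build one candidate architecture as a pipeline of library modules."""
    def model(x):
        for name in names:
            x = LIBRARY[name](x)
        return x
    return model

# Any ordered combination of module names is a candidate architecture:
candidate = compose(["attention", "quantize"])
print(candidate([1.0, 2.0]))
```

The point of the design is that candidates are just lists of names, so the search space is enumerable and a search process can generate, score, and mutate candidates without hand-written glue code.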
A recursive search with a large compute budget, using the current best scorers on the AGI test bench to select new coordinates in the possibility space, could explore more AI techniques than all human effort since the field began, within a year or two.
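The search loop described above can be sketched as a simple evolutionary search. Everything here is a stand-in: the module names, the toy `bench_score` (a real one would be the AGI test bench), and the mutation rule for proposing new coordinates from the best scorers.

```python
import random

random.seed(0)

# Hypothetical module library; names are invented placeholders.
LIBRARY = ["attention", "moe", "quantize", "retrieval", "distill"]

def bench_score(combo):
    # Stand-in for the AGI test bench: rewards one particular combination
    # of techniques and lightly penalizes bloat.
    target = {"attention", "moe", "retrieval"}
    return len(set(combo) & target) - 0.1 * len(combo)

def mutate(combo):
    # Propose a new coordinate in the possibility space near a good one.
    new = list(combo)
    if random.random() < 0.5:
        new[random.randrange(len(new))] = random.choice(LIBRARY)
    else:
        new.append(random.choice(LIBRARY))
    return tuple(dict.fromkeys(new))  # dedupe, keep order

# Start from random candidate architectures (combinations of modules).
population = [tuple(random.sample(LIBRARY, 2)) for _ in range(8)]
for generation in range(20):
    population.sort(key=bench_score, reverse=True)
    elites = population[:4]                       # best scorers survive
    population = elites + [mutate(random.choice(elites)) for _ in range(4)]

best = max(population, key=bench_score)
print(best, bench_score(best))
```

The "recursive" part of the argument would replace `mutate` with the best-scoring models themselves proposing the next candidates, which this toy loop does not attempt.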
That’s what matters. If the search runs in 2030, AGI by 2032; if it runs in 2098, AGI by 2100.
Like any exponential process, all the progress happens right at the end.