Paul Christiano’s view of soft takeoff introduced the idea that “takeoff” already starts before AGI.
I came up with a term I hope to see used for this: “Singularity Criticality”. In my mind I picture plutonium starting to glow as it edges over the line into critical mass.
What causes this is that AGI is not really a singleton; it is an integrated set of separate components that individually handle different elements of the AGI’s cognition. Note that even “AGI from scaled-up LLMs” will still have multiple components: multiple buffers, specialized vision and motion-planning modules, long-term memory storage, tool modules, and so on.
As a result, long before we know how to build the integrated system, we will have separate “AGI-grade” components, and this is the present reality: we already have many RL agents that are superhuman in their domains and thus AGI-grade.
Using those components, we can automate or accelerate some of the tasks needed to reach AGI, so progress accelerates even before AGI exists. The existence of pre-AGI proof-of-concept modules also spurs greater human effort, financial investment, and production of compute hardware.
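The feedback loop above can be sketched as a toy growth model: research progress advances at a base human rate plus a term proportional to progress already made, since existing pre-AGI components automate part of the remaining work. The parameter values and the "criticality" threshold here are illustrative assumptions, not measured quantities; this is a minimal sketch of the dynamic, not a forecast.

```python
def simulate(feedback: float, steps: int = 50, base_rate: float = 1.0) -> list[float]:
    """Integrate dP/dt = base_rate + feedback * P with unit time steps.

    base_rate : progress from unaided human research per step (assumed).
    feedback  : how strongly existing progress accelerates further progress
                via automation (assumed, dimensionless).
    """
    progress = 0.0
    history = []
    for _ in range(steps):
        progress += base_rate + feedback * progress
        history.append(progress)
    return history

# Weak feedback: growth stays roughly linear ("subcritical").
subcritical = simulate(feedback=0.01)

# Strong feedback: growth compounds like a chain reaction ("supercritical").
supercritical = simulate(feedback=0.2)
```

With weak feedback the curve is nearly a straight line; past a threshold the same recurrence turns exponential, which is the sense in which pre-AGI automation "edges over the line" toward criticality.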
Anyways, Singularity Criticality is empirical reality: it’s observable.