I would like to propose a conjecture for AI scaling, in a weak and a strong form:
Weak Scaling Conjecture: Scaling a model's parameters/compute plus data to within one order of magnitude of the number of synapses in the human brain is enough to get an AI as good as a human at language.

Strong Scaling Conjecture: No matter which form of neural network we use, scaling its parameters/compute plus data to within one order of magnitude of the number of synapses in the human brain is enough to make an AGI.
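To make the quantitative target concrete: under the commonly cited estimate of roughly 10^14 to 10^15 synapses in the human brain (an assumption on my part, not part of the conjecture's wording), "within one order of magnitude" pins down a parameter range. The sketch below is just that arithmetic spelled out, not a claim about any particular architecture.

```python
# Minimal sketch of the arithmetic behind the conjecture, assuming the
# commonly cited estimate of ~1e14 to 1e15 synapses in the human brain.
# "Within one order of magnitude" is read as a factor of 10 either way.

SYNAPSE_ESTIMATE_LOW = 1e14   # assumed lower bound on human synapse count
SYNAPSE_ESTIMATE_HIGH = 1e15  # assumed upper bound on human synapse count


def within_one_order_of_magnitude(reference: float) -> tuple[float, float]:
    """Return the (low, high) range within a factor of 10 of a reference."""
    return reference / 10, reference * 10


low, _ = within_one_order_of_magnitude(SYNAPSE_ESTIMATE_LOW)
_, high = within_one_order_of_magnitude(SYNAPSE_ESTIMATE_HIGH)

# Under these assumptions, the conjectures target models with roughly
# 1e13 to 1e16 parameters.
print(f"Implied parameter range: {low:.0e} to {high:.0e}")
```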