Humans suck at arithmetic. Really suck. Comparing a current GPU to a human trying and failing to multiply 10-digit numbers in their head, we can conclude that something about humans, hardware or software, is incredibly inefficient.
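A rough back-of-envelope sketch of the gap (the human rate below is an assumption, picked generously):

```python
import time

# Count how many 10-digit multiplications one CPU core does in a second,
# versus a generous estimate for a trained human working in their head.
a, b = 9_876_543_210, 1_234_567_890

start = time.perf_counter()
count = 0
while time.perf_counter() - start < 1.0:
    _ = a * b
    count += 1

human_rate = 1 / 60  # assumption: ~1 correct 10-digit product per minute
print(f"CPU: ~{count:,} multiplies/s, roughly {count / human_rate:,.0f}x a human")
```

Even in interpreted Python, with all its overhead, the ratio comes out in the millions; against the raw hardware multiplier it would be far larger still.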
Almost all humans have roughly the same sized brain.
So even if Einstein's brain was operating at 100% efficiency, the brain of the average human must be operating at far less.
That is, intelligence is easy: it just takes enormous amounts of compute for training.
Making a technology work at all is generally easier than making it efficient.
Current scaling laws seem entirely consistent with us having found an inefficient algorithm that works at all.
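To make that concrete, here is a hedged sketch of a Chinchilla-style scaling law (the functional form and coefficients are the published fits from Hoffmann et al. 2022; treat them as illustrative, not as an endorsement that these are the true constants). The point is that loss falls as a smooth power law in parameters N and training tokens D, with no visible floor of the kind you would expect if the algorithm were already near-optimal:

```python
# Chinchilla-style loss fit: L(N, D) = E + A/N^alpha + B/D^beta
# Coefficients are the published fits (assumption: taken at face value).
E, A, B, alpha, beta = 1.69, 406.4, 410.7, 0.34, 0.28

def loss(N, D):
    """Predicted loss for N parameters trained on D tokens."""
    return E + A / N**alpha + B / D**beta

# Scaling model and data 10x each buys a modest, predictable loss drop.
for N, D in [(1e9, 2e10), (1e10, 2e11), (1e11, 2e12)]:
    print(f"N={N:.0e}, D={D:.0e}: predicted loss = {loss(N, D):.3f}")
```

An inefficient-but-working algorithm would look exactly like this: you can always buy more capability, but only by paying a power-law compute tax.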
ChatGPT, for instance, uses billions of floating-point operations to do basic arithmetic mostly correctly. So it's clear that the likes of ChatGPT are also inefficient.
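A quick estimate of that overhead, using the standard ~2N FLOPs-per-token rule of thumb for transformer inference (the parameter count and token count below are illustrative assumptions, not measurements of any specific model):

```python
# Back-of-envelope: FLOPs a large transformer spends answering one
# small multiplication via next-token prediction, versus the single
# hardware multiply instruction the task actually requires.
params = 175e9               # assumption: a GPT-3-scale parameter count
flops_per_token = 2 * params # standard ~2N FLOPs/token inference estimate
tokens_needed = 20           # assumption: prompt plus answer digits

total = flops_per_token * tokens_needed
print(f"~{total:.1e} FLOPs to do the work of ~1 multiply instruction")
```

Under these assumptions that is about 7e12 FLOPs, roughly thirteen orders of magnitude more than the one fused multiply the GPU could have done directly.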
Now you could claim that ChatGPT and humans are mostly efficient, and just happen to drop 10 orders of magnitude when confronted with a multiplication; that they are pushing right up against the fundamental limits for everything except the most basic computational operations. But that seems implausible.