[Question] Is it a coincidence that GPT-3 requires roughly the same amount of compute as is necessary to emulate the human brain?

The amount of compute required to emulate the human brain depends on the level of detail we want to capture.
Back in 2008, Sandberg and Bostrom's *Whole Brain Emulation: A Roadmap* proposed the following estimates:
| Level of emulation detail | FLOPS required to run the emulation in real time |
| --- | --- |
| Analog network population model | 10^15 |
| Spiking neural network | 10^18 |
| Electrophysiology | 10^22 |
| Metabolome | 10^25 |
| Proteome | 10^26 |
| States of protein complexes | 10^27 |
| Distribution of protein complexes | 10^30 |
| Stochastic behavior of single molecules | 10^43 |
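To get a feel for these magnitudes, here is a minimal sketch (my own illustration, not from the Roadmap) of the slowdown at each level on a hypothetical exascale machine; the 10^18 FLOPS machine figure is an assumption:

```python
# Sandberg & Bostrom (2008): FLOPS needed to run the emulation in real time.
LEVELS = {
    "Analog network population model": 1e15,
    "Spiking neural network": 1e18,
    "Electrophysiology": 1e22,
    "Metabolome": 1e25,
    "Proteome": 1e26,
    "States of protein complexes": 1e27,
    "Distribution of protein complexes": 1e30,
    "Stochastic behavior of single molecules": 1e43,
}

MACHINE_FLOPS = 1e18  # assumed: a roughly exascale supercomputer

for level, required in LEVELS.items():
    # Wall-clock seconds of compute per second of simulated brain time.
    slowdown = required / MACHINE_FLOPS
    print(f"{level}: {slowdown:.0e} s of compute per simulated second")
```

On such a machine, only the two coarsest levels run at or faster than real time; single-molecule detail falls short by 25 orders of magnitude.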
Today I came across an interesting piece of data on GPT-3 (source):
- GPT-3 required ~10^15 FLOPS for inference.
- It required ~10^23 FLOP in total to train. [Note: the training ran over a few months; as the quick arithmetic below shows, retraining it from zero in one second would require ~10^23 FLOPS.]
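The note above is just unit bookkeeping between total operations (FLOP) and a sustained rate (FLOPS). A quick sketch of the arithmetic; the ~3×10^23 FLOP total is the commonly cited figure for GPT-3, while the three-month wall-clock time is my assumption:

```python
TOTAL_TRAINING_FLOP = 3.14e23  # commonly cited GPT-3 total (~3,640 petaflop/s-days)

assumed_months = 3                               # assumption: wall-clock training time
wall_clock_s = assumed_months * 30 * 24 * 3600   # ~8e6 seconds

sustained = TOTAL_TRAINING_FLOP / wall_clock_s   # ~4e16 FLOPS averaged over training
in_one_second = TOTAL_TRAINING_FLOP / 1.0        # ~3e23 FLOPS to redo it in one second

print(f"sustained rate over training: {sustained:.1e} FLOPS")
print(f"rate to train from zero in one second: {in_one_second:.1e} FLOPS")
```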
As far as I know, GPT-3 was the first AI with a range and quality of cognitive abilities comparable to the human brain's (though still far from human level on many tasks).
Coincidentally(?), GPT-3's compute sits in the range of 10^15 FLOPS (running inference at the brain's speed) to ~10^23 FLOPS (compressing its entire training into one second), which falls within the 10^15 − 10^30 FLOPS needed to run a decent emulation of the human brain.
The space of conceivable compute requirements is effectively unbounded (10^100 FLOPS and beyond). Yet both intelligences land in the same relatively narrow band of 10^15 − 10^30 FLOPS (assuming the brain emulation doesn't need to be nano-level detailed; see the sketch below).
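A minimal sketch, using the endpoints above, makes the overlap concrete in log10 space:

```python
import math

# (low, high) compute figures in FLOPS, taken from the estimates above.
gpt3 = (1e15, 1e23)    # inference rate .. training compressed into one second
brain = (1e15, 1e30)   # Sandberg & Bostrom range, excluding single-molecule detail

def span_oom(lo, hi):
    """Width of an interval in orders of magnitude."""
    return math.log10(hi) - math.log10(lo)

print(f"GPT-3 spans {span_oom(*gpt3):.0f} orders of magnitude")
print(f"Brain emulation spans {span_oom(*brain):.0f} orders of magnitude")
# GPT-3's whole range sits inside the brain-emulation range.
print("GPT-3 inside brain range:", brain[0] <= gpt3[0] and gpt3[1] <= brain[1])
```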
Is it a coincidence, or is there something deeper going on here?
This could be important both for understanding the human brain and for predicting how far we are from true AGI.