There was a recent post estimating that GPT-3 is equivalent to about 175 bees. There is also a comment there asserting that a human is about 140k bees.
I would be very interested if someone could explain where this huge discrepancy comes from. (One estimate equates synapses with parameters, while this one is based on FLOPS; but there shouldn’t be such a huge difference.)
The range of possible compute is almost infinite (e.g. 10^100 FLOPS and beyond). Yet both intelligences are in the same relatively narrow range of 10^15 − 10^30 FLOPS.

10^15 − 10^30 is not at all a narrow range! So depending on what the ‘real’ answer is, there could be as little as zero discrepancy between the ratios implied by these two posts, or a huge amount. If we decide that GPT-3 uses 10^15 FLOPS (the inference amount) and meanwhile the first “decent” simulation of the human brain is the “Spiking neural network” (10^18 FLOPS according to the table), then the human-to-GPT ratio is 10^18 / 10^15 = 1,000, which is almost exactly 140k / 175 ≈ 800. Whereas if you actually need the single-molecules version of the brain (10^43 FLOPS), there’s suddenly an extra factor of ten septillion lying around.
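A quick sanity check of that arithmetic, using only the figures quoted above (my own back-of-the-envelope sketch, not anything from either post):

```python
# Back-of-the-envelope check of the ratios discussed above.
# All FLOPS figures are the table values quoted in this thread.

gpt3_inference_flops = 1e15   # GPT-3 inference estimate
brain_snn_flops = 1e18        # "Spiking neural network" brain simulation
brain_molecule_flops = 1e43   # "single molecules" brain simulation

bee_post_ratio = 140_000 / 175                        # human-to-GPT-3, per the bee post
flops_ratio = brain_snn_flops / gpt3_inference_flops  # human-to-GPT-3, per the FLOPS table

print(f"bee-post ratio: {bee_post_ratio:.0f}")  # 800
print(f"FLOPS ratio:    {flops_ratio:.0f}")     # 1000

# And the extra factor if the molecular-level simulation is what's needed:
print(f"{brain_molecule_flops / brain_snn_flops:.0e}")  # 1e+25, i.e. ten septillion
```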
Author of the post here — I don’t think there’s a huge discrepancy: 140k/175 is clearly within the range of uncertainty of these estimates!

That being said, the Bee post really shouldn’t be taken too seriously. One synapse is not exactly one float16 or int8 parameter, etc.
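For what it’s worth, here is a rough reconstruction of how the parameters-as-synapses arithmetic presumably goes. The synapse counts are my own order-of-magnitude assumptions (flagged in the comments), not numbers taken from either post:

```python
# Hypothetical reconstruction of the bee arithmetic, assuming the method
# is simply "parameters ~ synapses". The synapse counts below are rough
# order-of-magnitude assumptions of mine, not numbers from either post.

bee_synapses = 1e9       # assumed: ~10^9 synapses in a honeybee brain
human_synapses = 1.5e14  # assumed: ~1.5e14 synapses in a human brain
gpt3_params = 175e9      # GPT-3's parameter count

print(f"GPT-3 in bees: {gpt3_params / bee_synapses:.0f}")     # 175
print(f"human in bees: {human_synapses / bee_synapses:.0f}")  # 150000, roughly the 140k cited

# The author's caveat applies: one synapse is not literally one int8 or
# float16 parameter, so each conversion carries OOMs of slack.
```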
I didn’t read that post. Should I? Is it more than a joke?

Edit: I read it. It was a lot shorter than I expected; sometimes I’m a dumbass about reading posts and forget to check length. It’s a really simple point, made in the first words, and I figured there would be more to it than that for some reason. There isn’t.
I wouldn’t call it a huge discrepancy. If both values are correct, it means the human brain requires only 10² − 10³ times more compute than GPT-3.

The difference could have been dozens or even hundreds of OOMs, but it’s only 2 − 3, which is quite interesting. Why is the difference in compute so small, if the nature of the two systems is so different?
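To put the OOM framing in code: a minimal sketch assuming GPT-3 at ~10^15 FLOPS (the inference figure), with the two brain-simulation estimates quoted earlier in the thread. The “only 2 − 3 OOMs” conclusion holds for the spiking-neural-network estimate, not the molecular one:

```python
import math

# Orders-of-magnitude gap between GPT-3 (~10^15 FLOPS, inference) and
# the human brain, under the two simulation estimates quoted above.

gpt3_flops = 1e15
brain_estimates = {
    "spiking neural network": 1e18,
    "single molecules": 1e43,
}

for name, flops in brain_estimates.items():
    ooms = math.log10(flops / gpt3_flops)
    print(f"{name}: {ooms:.0f} OOMs")  # 3 OOMs and 28 OOMs respectively
```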