It’s my assumption because our brains achieve AGI on ~20 W.
I think that’s probably the crux. I think the evidence that the brain is not performing that much computation is reasonably good, so I attribute the difference to algorithmic advantages the brain has, particularly ones that make the brain more data efficient relative to today’s neural networks.
That the brain is more data efficient is, I think, hard to dispute, but of course you can argue that this is simply because the brain is doing a lot more computation internally to process the limited data it does see. I’m more ready to believe that the brain has some software advantage over neural networks than to believe it has an enormous hardware advantage.
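To make the quantitative disagreement concrete, here is a rough back-of-envelope sketch in Python. Every figure in it is an assumption rather than a measurement: the synapse count, average firing rate, operations attributed per synaptic event, and the accelerator numbers are loose order-of-magnitude placeholders, and the comparison shifts by several orders of magnitude depending on how they are chosen.

```python
# Rough order-of-magnitude comparison of brain vs. accelerator compute and
# energy efficiency. All numbers below are loose, assumed estimates.

synapses = 1e14                 # assumed synapse count (~1e14 is a common estimate)
avg_firing_rate_hz = 1.0        # assumed average spike rate per synapse
ops_per_synaptic_event = 10     # assumed; plausible range is roughly 1 to 100
brain_watts = 20                # the ~20 W figure from the discussion above

brain_ops_per_s = synapses * avg_firing_rate_hz * ops_per_synaptic_event
brain_ops_per_joule = brain_ops_per_s / brain_watts

# A modern accelerator for comparison (ballpark figures, not a specific spec):
gpu_flops = 1e15                # ~1 PFLOP/s in low precision
gpu_watts = 700

gpu_flops_per_joule = gpu_flops / gpu_watts

print(f"Brain: ~{brain_ops_per_s:.0e} ops/s, ~{brain_ops_per_joule:.0e} ops/J")
print(f"GPU:   ~{gpu_flops:.0e} FLOP/s, ~{gpu_flops_per_joule:.0e} FLOP/J")
```

Under these particular assumptions the brain's total throughput lands in the same ballpark as a single accelerator while using far less energy, which is the shape of the "not that much computation, but much better algorithms" position; choosing a higher operations-per-synaptic-event figure instead pushes the estimate toward a large hardware advantage for the brain.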