I don’t think transistors have too much to do with neurons beyond the abstract observation that neurons most likely store information by establishing gradients of potential energy. When the stored information needs to be updated, that means some gradients have to get moved around, and if I had to imagine how this works inside a cell it would probably involve some kind of proton pump operating across a membrane or something like that. That’s going to be functionally pretty similar to a capacitor, and discharging & recharging it probably carries similar free energy costs.
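To put rough numbers on the capacitor analogy (a ballpark sketch: the ~1 µF/cm² specific membrane capacitance and the ~100 mV swing are standard textbook values, while the 1 µm² patch size is my own arbitrary assumption), the free energy dissipated in one discharge/recharge cycle is on the order of

\[
E \approx \tfrac{1}{2} C V^2 \approx \tfrac{1}{2}\,(1\ \mu\mathrm{F/cm^2})(1\ \mu\mathrm{m^2})(0.1\ \mathrm{V})^2 \approx 5\times 10^{-17}\ \mathrm{J},
\]

which is roughly four orders of magnitude above the Landauer bound \(k_B T \ln 2 \approx 3\times 10^{-21}\ \mathrm{J}\) at 300 K, and on these rough numbers broadly comparable to the \(\tfrac{1}{2}CV^2\) switching energy of a small CMOS gate.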
I think what I don’t understand is why you’re defaulting to the assumption that the brain has a way to store and update information that’s much more efficient than what we’re able to do. That doesn’t sound like a state of ignorance to me; it seems like you wouldn’t hold this belief if you didn’t think there was a good reason to do so.
> I think what I don’t understand is why you’re defaulting to the assumption that the brain has a way to store and update information that’s much more efficient than what we’re able to do. That doesn’t sound like a state of ignorance to me; it seems like you wouldn’t hold this belief if you didn’t think there was a good reason to do so.
It’s my assumption because our brains are AGI for ~20 W.
In contrast, many kW of GPUs are not AGI.
Therefore, it seems like brains have a way of storing and updating information that’s much more efficient than what we’re able to do.
Of course, maybe I’m wrong and the gap is due to a lack of training, data, or algorithms rather than a lack of hardware.
DNA storage is way more information dense than hard drives, for example.
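To make the power arithmetic in the comparison above concrete, here is a minimal back-of-envelope sketch. The 20 W figure is the one used in the comment; the per-GPU draw, cluster size, and the 10 MW / 90-day training run are assumptions picked purely for illustration, not measurements from this discussion.

```python
# Back-of-envelope sketch of the "20 W brain vs. many kW of GPUs" point.
# All figures are rough assumptions chosen for illustration.

SECONDS_PER_YEAR = 365.25 * 24 * 3600

brain_power_w = 20.0       # ~20 W, the figure used in the comment
gpu_power_w = 700.0        # assumed per-accelerator draw
cluster_gpus = 16          # assumed small training node
cluster_power_w = gpu_power_w * cluster_gpus

print(f"cluster / brain power ratio: {cluster_power_w / brain_power_w:.0f}x")

# Energy over an 18-year "training run" for the brain, vs. an assumed
# hypothetical 10 MW cluster running for 90 days.
brain_energy_j = brain_power_w * 18 * SECONDS_PER_YEAR
cluster_energy_j = 10e6 * 90 * 24 * 3600

def to_mwh(joules):
    return joules / 3.6e9

print(f"brain, 18 years:  {to_mwh(brain_energy_j):.1f} MWh")
print(f"cluster, 90 days: {to_mwh(cluster_energy_j):.0f} MWh")
print(f"energy ratio:     {cluster_energy_j / brain_energy_j:.0f}x")
```

On these assumed numbers the power ratio comes out to a few hundred and the energy ratio to a few thousand; the point of the sketch is only the rough scale, not the exact figures.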
> It’s my assumption because our brains are AGI for ~20 W.
I think that’s probably the crux. The evidence that the brain is not performing that much computation seems reasonably good to me, so I attribute the difference to algorithmic advantages the brain has, particularly ones that make it more data-efficient than today’s neural networks.
That the brain is more data-efficient is, I think, hard to dispute, though you can argue this is simply because the brain is doing a lot more computation internally to process the limited data it does see. I’m more ready to believe that the brain has some software advantage over neural networks than that it has an enormous hardware advantage.
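For a sense of the scale of the data-efficiency gap being gestured at here, a minimal sketch; both quantities are order-of-magnitude assumptions on my part (and it glosses over the difference between words and tokens):

```python
# Rough sketch of the data-efficiency gap discussed above.
# Both figures are order-of-magnitude assumptions, not sourced numbers.

human_words_by_adulthood = 3e8   # assumed: a few hundred million words of linguistic input
llm_pretraining_tokens = 1e13    # assumed: order of magnitude for a modern large language model

ratio = llm_pretraining_tokens / human_words_by_adulthood
print(f"LLM pretraining sees roughly {ratio:,.0f}x more text than a person")  # ~3e4
```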