Your instinct is right. The Landauer limit says that erasing 1 bit of information dissipates at least k_B·T·ln(2) of energy, and erasure is necessary to run any irreversible function that outputs 1 bit (the output bit has to be reset eventually). The important thing to note is that the limit scales with the temperature T, measured on an absolute scale. Human brains operate at about 310 Kelvin. Ordinary chips can already operate down to around ~230 Kelvin, and there is even a recently developed chip which operates at ~0.02 Kelvin.
So human brains being near the thermodynamic limit in this case means very little about what sort of efficiencies are possible in practice.
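To make the temperature scaling concrete, here is a minimal sketch that evaluates k_B·T·ln(2) at the three temperatures mentioned above (the labels and exact temperatures are illustrative, taken from this comment, not from any measurement of a specific chip):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def landauer_limit(temp_kelvin: float) -> float:
    """Minimum energy (joules) to erase one bit at the given temperature."""
    return K_B * temp_kelvin * math.log(2)

for label, t in [("human brain", 310.0), ("cold chip", 230.0), ("cryo chip", 0.02)]:
    print(f"{label:>12} at {t:>7} K: {landauer_limit(t):.3e} J/bit")
```

The limit is linear in T, so the cryogenic chip's per-bit floor is about four orders of magnitude below the brain's.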
Your point about skull sizes [being bounded by childbirth death risk] seems very strong for evolutionary reasons. I would also add that bird brains seem to do similar amounts of cognition (compared to smallish mammals) in a much more compact volume, without having substantially higher body temperatures (~315 Kelvin).
Cooling the computer doesn’t let you get around the Landauer limit! The savings in energy you get by erasing bits at low temperature are offset by the energy you need to dissipate to keep your computer cold. (Erasing a bit at low temperature still generates some heat, and when you work out how much energy your refrigerator has to use to get rid of that heat, it turns out that you must dissipate the same amount as the Landauer limit says you’d have to if you just erased the bit at ambient temperatures.) To get real savings, you have to actually put your computer in an environment that is naturally colder. For example, if you could put a computer in deep space, that would work.
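The refrigerator accounting in the parenthetical can be checked directly. The sketch below assumes an ideal (Carnot) refrigerator and a room-temperature ambient of 293 K; the point is that the total heat rejected to the ambient environment is the same no matter how cold the chip runs:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def total_dissipation(t_cold: float, t_ambient: float) -> float:
    """Total energy dumped into the ambient environment when one bit is
    erased at t_cold and the resulting heat is pumped out by an ideal
    (Carnot) refrigerator rejecting to t_ambient."""
    q_erase = K_B * t_cold * math.log(2)            # heat released by the erasure
    w_fridge = q_erase * (t_ambient / t_cold - 1.0)  # Carnot work to pump it out
    return q_erase + w_fridge

t_ambient = 293.0  # room temperature, K
ambient_limit = K_B * t_ambient * math.log(2)

for t_cold in (230.0, 77.0, 4.0):
    print(t_cold, total_dissipation(t_cold, t_ambient) / ambient_limit)  # ratio ≈ 1.0
```

Algebraically, q_erase + w_fridge = k_B·T_cold·ln(2)·(T_ambient/T_cold) = k_B·T_ambient·ln(2), which is exactly the ambient-temperature Landauer cost.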
On the other hand, there might also be other good reasons to keep a computer cold. For example, if you want to lower the voltage needed to represent a bit, keeping your computer cold would plausibly help with that. It just won’t reduce your Landauer-limit-imposed power bill.
None of this is to say that I agree with the rest of Jacob’s analysis of thermodynamic efficiency; I believe he’s made a couple of shaky assumptions and one actual mistake. Since this is getting a lot of attention, I might write a post on it.
Deep space is a poor cooling medium: the only way to dissipate energy there is radiation, which is slower than convection on Earth. Vacuums are typically used to insulate things (e.g. a thermos).
That is true, and I concede that that weakens my point.
It still seems to be the case that you could get a ~35% efficiency increase by operating in e.g. Antarctica. I also have an intuition, which I’ll need to think more about, that there are trade-offs with the Landauer limit that could yield substantial gains by separating things that biology constrains to be close together: similar to how a human with an air conditioner can thrive in much hotter environments (using more energy overall, but not energy that has to be in thermal contact with the brain via e.g. the same circulatory system).
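A quick sanity check of the ~35% figure, assuming the comparison is brain temperature (310 K) against an Antarctic ambient of roughly 230 K (about −43 °C): since the per-bit energy scales with T, bit erasures per joule scale as 1/T.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

t_brain = 310.0       # K
t_antarctica = 230.0  # K, roughly -43 degrees C (assumed ambient)

energy_per_bit_warm = K_B * t_brain * math.log(2)
energy_per_bit_cold = K_B * t_antarctica * math.log(2)

# Gain in bit erasures per joule from the colder environment:
gain = energy_per_bit_warm / energy_per_bit_cold - 1.0
print(f"{gain:.0%}")  # → 35%
```

So the ~35% figure corresponds to 310/230 − 1 ≈ 0.35, i.e. more erasures per joule, under the assumed temperatures.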
Norway and Sweden do happen to be popular datacenter locations at the moment, but more for cheap power than for cooling, from what I understand. The problem with Antarctica would be terrible solar production for much of the year.
You can play the same game in the other direction. Given a cold source, you can run your chips hot, and use a steam engine to recapture some of the heat.
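The recapture idea bounds out at Carnot efficiency. A minimal sketch, with hypothetical temperatures (a chip deliberately run hot at 400 K dumping into a 230 K cold sink; neither number comes from the thread):

```python
def carnot_efficiency(t_hot: float, t_cold: float) -> float:
    """Maximum fraction of heat convertible to work by an ideal engine
    running between t_hot and t_cold (both in kelvin)."""
    return 1.0 - t_cold / t_hot

# Hypothetical: chip run hot at 400 K, cold sink at 230 K.
print(f"{carnot_efficiency(400.0, 230.0):.1%}")  # → 42.5%
```

That is, at best a bit under half the chip’s dissipated heat could be recaptured as work in this scenario; a real steam engine would do considerably worse.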
In a room-temperature bath this always costs more energy: there is no free lunch in cooling. However, in the depths of outer space this may become relevant.
The Landauer limit still applies.