The post is making somewhat outlandish claims about thermodynamics. My initial response was along the lines of “of course this is wrong. Moving on.” I gave it another look today. In one of the first sections I found (what I think is) a crucial mistake. As such, I didn’t read the rest. I assume it is also wrong.
The original post said:
A non-superconducting electronic wire (or axon) dissipates energy according to the same Landauer limit per minimal wire element. Thus we can estimate a bound on wire energy based on the minimal assumption of 1 minimal energy unit Eb per bit per fundamental device tile, where the tile size for computation using electrons is simply the probabilistic radius or De Broglie wavelength of an electron[7:1], which is conveniently ~1nm for 1eV electrons, or about ~3nm for 0.1eV electrons. Silicon crystal spacing is about ~0.5nm and molecules are around ~1nm, all on the same scale.
Thus the fundamental (nano) wire energy is: ~1 Eb/bit/nm, with Eb in the range of 0.1eV (low reliability) to 1eV (high reliability).
The predicted wire energy is ~10^−19 J/bit/nm or ~100 fJ/bit/mm for semi-reliable signaling at 1V with Eb = 1eV, down to ~10 fJ/bit/mm at 100mV with complex error correction, which is an excellent fit for actual interconnect wire energy[8][9][10][11], [...]
The measured/simulated interconnect wire energies from the citations, in the realm of 10s–100s of fJ/bit/mm, are a result of the physical properties of the interconnects: resistance (they’re very small wires) and stray capacitance. In principle, these numbers could be made almost arbitrarily worse by using smaller (cross-sectional) interconnects, more resistive materials, or tighter packing of components. They can also be made significantly better, especially if you’re allowed to consider alternative materials. Importantly, this loss can be reduced dramatically by lowering the operating voltage of the system (switching energy scales as CV²), but some components do not work well at lower voltages, so there’s a tradeoff.
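To see how strongly these numbers depend on capacitance and voltage rather than on any Landauer-like bound, here is a rough sketch. The capacitance-per-length figure is an assumed, typical order-of-magnitude value for on-chip interconnect, not a measurement from the cited papers; the scaling, not the exact number, is the point.

```python
# Sketch: interconnect energy per bit from wire capacitance and swing voltage.
# Driving a wire of capacitance C to voltage V costs ~C*V^2 per full
# charge/discharge cycle (1/2*C*V^2 stored on the wire, another 1/2*C*V^2
# dissipated in the driver's resistance). C_PER_MM is an assumed typical value.

C_PER_MM = 0.2e-12  # farads per mm of wire (~0.2 pF/mm, on-chip scale)

def energy_fj_per_bit_per_mm(v_swing):
    """Energy (fJ) to signal one bit over 1 mm at the given swing voltage."""
    return C_PER_MM * v_swing**2 * 1e15  # joules -> femtojoules

print(energy_fj_per_bit_per_mm(1.0))  # ~200 fJ/bit/mm at 1 V
print(energy_fj_per_bit_per_mm(0.1))  # ~2 fJ/bit/mm at 100 mV
```

The quadratic voltage dependence alone spans the cited range, and changing the wire geometry or material moves C and the resistive losses freely, while an Eb/bit/λ formula predicts no change at all.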
… and we’re supposed to compare that number with Eb/bit/λ. I might be willing to buy the wide range of Eb, but the choice of the de Broglie wavelength as the “minimal wire element” is completely arbitrary. The author seems to know this, because they give a few more examples of length scales that are around a nanometer. I can do that too: the spacing of conduction electrons in copper (n^−1/3) is roughly 0.2 nm, and the mean free path of electrons in copper is a few nm. None of that matters, because signals do not propagate through wires by hopping from electron to electron. There is a complicated dance of electric field, magnetic field, conduction electrons, and dielectrics that all work together to make signals move. The equation asserts that none of this matters, and it is simply unphysical. Sorry, there’s no nice way to put that.
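For what it’s worth, these candidate length scales are easy to compute from textbook constants; here is a sketch showing how many unrelated quantities land near a nanometer (the copper electron density is the standard handbook value):

```python
import math

H = 6.626e-34     # Planck constant, J*s
M_E = 9.109e-31   # electron mass, kg
EV = 1.602e-19    # joules per eV
N_CU = 8.5e28     # conduction electron density of copper, m^-3

def de_broglie_nm(energy_ev):
    """De Broglie wavelength (nm) of an electron with the given kinetic energy."""
    p = math.sqrt(2 * M_E * energy_ev * EV)  # momentum from E = p^2 / 2m
    return H / p * 1e9                        # meters -> nm

print(de_broglie_nm(1.0))     # ~1.2 nm (the post's "~1 nm for 1 eV electrons")
print(de_broglie_nm(0.1))     # ~3.9 nm
print(N_CU ** (-1/3) * 1e9)   # ~0.23 nm, electron spacing n^(-1/3) in copper
```

All of these sit within an order of magnitude of a nanometer, which is exactly why picking any one of them as the “fundamental device tile” proves nothing.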
The author seems to want us to compare the two equations, but they are truly two different things. I can communicate the same information in a circuit (Eb/bit/λ held fixed) but dramatically vary the cited “excellent fit” numbers by orders of magnitude by changing their material or lowering their voltage.
The Landauer energy is very, very small compared to just about every other energy we care about; it is basically a non-factor in all but a few very esoteric experiments. kT ln 2 ≈ 20 meV at room temperature is a decent chunk of energy if it is given to every electron in a system, but it is extremely small as a one-off. It is absurd to think that computers, with their resistive wires and leaky gates, are anywhere near perfect efficiency. It is even more absurd to think that brains, squishy beasts that literally pump ions across membranes and back, are anywhere near that limit.
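To put numbers on “very, very small”, here is the Landauer limit at room temperature next to the ~100 fJ/bit/mm interconnect figure quoted above:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

landauer_j = K_B * T * math.log(2)            # minimum energy to erase one bit
landauer_mev = landauer_j / 1.602e-19 * 1e3   # same, in meV

wire_j = 100e-15  # ~100 fJ/bit/mm, the figure cited in the post

print(landauer_mev)         # ~18 meV per bit
print(wire_j / landauer_j)  # ~3e7 Landauer limits per bit per mm
```

A factor of tens of millions per millimeter of wire is not “near the thermodynamic limit” by any reasonable reading.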
Looking through the comments, I now see that another user has correctly pointed out the same mistakes. See here, and the comments under that. Give them the $250. EY also pointed out the absurdity of brains being considered anywhere near efficient. Nice work!