I think your thermodynamics is dubious. Firstly, it is thermodynamically possible to run error-free computations very close to the thermodynamic limits. This just requires the energy used to represent a bit to be significantly larger than the energy dissipated as waste heat when a bit is deleted.
Considering a cooling fluid of water flowing at 100 m/s through fractally structured pipes of total cross-section 0.01 m^2 and being heated from 0 °C to 100 °C, the cooling power is roughly 400 megawatts.
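The figure above follows from mass flow times specific heat times temperature rise. A minimal sketch of the arithmetic (assuming standard values of 1000 kg/m^3 for water's density and 4186 J/(kg·K) for its specific heat):

```python
RHO_WATER = 1000.0   # kg/m^3, density of water (standard value)
C_P_WATER = 4186.0   # J/(kg*K), specific heat of water (standard value)

velocity = 100.0     # m/s, flow speed from the comment
area = 0.01          # m^2, total pipe cross-section from the comment
delta_t = 100.0      # K, heating from 0 °C to 100 °C

mass_flow = RHO_WATER * velocity * area   # kg/s of water moved
power = mass_flow * C_P_WATER * delta_t   # W absorbed by the coolant
print(f"{power / 1e6:.0f} MW")            # about 400 MW, matching the comment
```

The exact result is about 419 MW, consistent with the "400 megawatts" quoted above.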
I think that superconducting chips are in labs today. The confident assertion that superconducting reversible computing (or quantum computing) won't appear before AGI is dubious at best.
Finally, have you heard of super-resolution microscopy (https://en.wikipedia.org/wiki/Super-resolution_microscopy)? There was what appeared to be a fundamental limit on microscopes, based on the wavelength of light, yet physicists found several different ways to get images beyond it. I think there are quite a lot of cases where X is technically allowed under the letter of the mathematical equations but feels like cheating. This is the sort of rough analysis that would have ruled out that possibility. (And it did rule out the similar possibility of components far smaller than a photon's wavelength communicating with photons.) This kind of rough analysis can easily prove possibility, but it takes a pedant with the security mindset and a keen knowledge of exactly what the limits say to show anything is impossible. So beyond reversible computing and quantum computing, there are other ideas out there, not yet invented, that skirt around physical limits on a technicality.
I think that superconducting chips are in labs today. The confident assertion that superconducting reversible computing (or quantum computing) won't appear before AGI is dubious at best.
I'm reasonably well read on reversible computing. It's dramatically less area-efficient, and it requires a radically restricted new programming model, far more restrictive than the serial-to-parallel programming transition. I will take and win any bets on reversible computing becoming a multi-billion-dollar practical industry before AGI.
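The restriction comes from requiring every operation to be a bijection on machine state: nothing may be overwritten, so ordinary destructive assignment is off the table. A minimal sketch using the Toffoli (CCNOT) gate, a standard universal reversible gate, illustrates the constraint:

```python
def toffoli(a, b, c):
    """Toffoli (CCNOT) gate: flips c iff a and b are both 1.
    Universal for reversible Boolean logic."""
    return a, b, c ^ (a & b)

# Every reversible operation must be a bijection on the state space:
# no two inputs map to the same output, so no information is erased.
inputs = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
outputs = [toffoli(*x) for x in inputs]
assert sorted(outputs) == sorted(inputs)                 # a permutation of 3-bit states
assert all(toffoli(*toffoli(*x)) == x for x in inputs)   # self-inverse: apply twice, get input back
```

Note that computing AND this way costs an extra carried-along bit (the target `c`); that overhead, compounded across a whole program, is one source of the area inefficiency mentioned above.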
Firstly, it is thermodynamically possible to run error-free computations very close to the thermodynamic limits. This just requires the energy used to represent a bit to be significantly larger than the energy dissipated as waste heat when a bit is deleted.
In theory it's possible to perform computation without erasing bits, i.e. reversible computation, as mentioned in the article. And sure, you can use more energy than necessary to represent a bit, but there's not much point in that when you could instead use the Landauer-bound minimum of kT ln 2 per erased bit.
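For scale, the Landauer bound at room temperature is a tiny amount of energy. A quick sketch of the number (assuming 300 K as "room temperature"):

```python
import math

K_B = 1.380649e-23  # J/K, Boltzmann constant (exact in the 2019 SI)
T = 300.0           # K, assumed room temperature

# Landauer bound: minimum heat dissipated per irreversibly erased bit
landauer = K_B * T * math.log(2)
print(f"{landauer:.2e} J per erased bit")   # ~2.87e-21 J
```

So the argument in the quoted comment amounts to making the energy representing a bit many multiples of this ~3 zeptojoule floor.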