“Now, an experiment has settled this controversy. It clearly shows that there is no such minimum energy limit and that a logically irreversible gate can be operated with an arbitrarily small energy expenditure. Simply put, it is not true that logical irreversibility implies physical irreversibility, as Landauer wrote.”
It will push research towards “zero-power” computing: the search for new information-processing devices that consume less energy. This is of strategic importance for the future of the entire ICT sector, which has to deal with the problem of excess heat production during computation.
It will call for a deep revision of the “reversible computing” field. In fact, one of the main motivations for the field’s existence (the presence of a lower energy bound) would disappear.
If true, this has some spectacular implications for computing (long term).
http://phys.org/news/2016-07-refutes-famous-physical.html
Some of the limits of computation (how much you could theoretically do with a given amount of energy) are based on what appear to have been incorrect beliefs about information processing and entropy.
This will not have any practical consequences whatsoever, even in the long term. It is already possible to perform reversible computation (see the Bennett paper linked in the article), for which such lower bounds don’t apply. The idea is very simple: just make sure that your individual logic gates are reversible, so you can uncompute everything after reading out the results. This is most easily achieved by writing the gate’s output to a separate wire. For example, instead of mapping 2 inputs to 1 output like
(x, y) --> (x OR y),
a reversible OR gate would map 3 inputs to 3 outputs like
(x, y, z) --> (x, y, z XOR (x OR y)),
making the gate its own inverse.
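A minimal sketch of this in Python (my own illustration; the name reversible_or is mine, not something from Bennett's paper): the OR of the first two wires is XORed onto a third ancilla wire, and applying the gate twice restores the original state.

    def reversible_or(x: int, y: int, z: int):
        # Reversible OR on bits: (x, y, z) -> (x, y, z XOR (x OR y)).
        # x and y pass through unchanged; the result is XORed onto the
        # third wire, so no information is discarded.
        return x, y, z ^ (x | y)

    # The gate is its own inverse: applying it twice restores the inputs.
    for x in (0, 1):
        for y in (0, 1):
            for z in (0, 1):
                assert reversible_or(*reversible_or(x, y, z)) == (x, y, z)

    # With the ancilla z initialised to 0, the third wire reads out x OR y.
    print(reversible_or(1, 0, 0))  # -> (1, 0, 1)

Copying out the answer from the third wire and then running the gates in reverse returns every ancilla to 0, which is the sense in which nothing has to be erased.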
Secondly, I understand that the Landauer bound is so extremely small that worrying about it in practice is like worrying about the speed of light while designing an airplane.
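To put rough numbers on that (a back-of-the-envelope sketch; the ~1 fJ figure for a conventional switching event is only an illustrative order of magnitude, not a measured value):

    import math

    k_B = 1.380649e-23     # Boltzmann constant, J/K
    T = 300.0              # room temperature, K

    landauer = k_B * T * math.log(2)   # Landauer bound per erased bit
    typical_switch = 1e-15             # assumed ~1 fJ per conventional switching event

    print(f"Landauer bound at 300 K: {landauer:.2e} J")            # ~2.9e-21 J
    print(f"A ~1 fJ switch dissipates ~{typical_switch / landauer:.1e}x more")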
Finally, I don’t know how controversial the Landauer bound is among physicists, but I’m skeptical in general of any experimental result that violates established theory. Recall that just a while ago there were experiments that appeared to show faster-than-light travel but turned out to be a sensor/timing problem. I can imagine many ways in which measurement errors could sneak in, given the very small amount of energy being measured here.
While you can always make the computation reversible, it comes at a price: carrying around a larger and larger number of bits, which take space and time to communicate and store.
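A toy illustration of that price (my own sketch, not something from the linked paper): OR-ing n bits with the reversible gate sketched above keeps every intermediate result on its own ancilla wire, so the state grows by roughly one bit per gate instead of collapsing to a single output.

    def reversible_or(x, y, z):
        # (x, y, z) -> (x, y, z XOR (x OR y)), as in the gate sketched above.
        return x, y, z ^ (x | y)

    def or_all_reversibly(bits):
        # Reversibly OR a list of bits; returns (result, garbage ancillas kept around).
        acc, garbage = bits[0], []
        for b in bits[1:]:
            # Each gate application consumes a fresh ancilla initialised to 0.
            _, _, acc_next = reversible_or(acc, b, 0)
            garbage.append(acc)   # the old intermediate cannot be erased for free
            acc = acc_next
        return acc, garbage

    result, garbage = or_all_reversibly([0, 1, 0, 0, 1])
    print(result, len(garbage))   # 1 4: five inputs leave four extra bits behind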
I think that the Landauer limit is controversial. But if it’s wrong, one should be able to explain why at the level of theory. What ordinary models of physics predict about their gate would be much more convincing than an experiment. How did they design their gate if they don’t have a competing theory?
As far as I can see, the experiment has shown that what was considered to be the lower bound actually isn’t one.
However, I don’t understand how the claim of “no lower bound at all” follows from that. For all we know there is simply a different, smaller lower bound.
I found it odd as well, but I think it’s because the result implies that the theoretical reason for that lower bound may be invalid.
There will likely turn out to be a different theoretical lower bound for some other reason, but right now we don’t have that theoretical reason.