I guess that quantum computers could halve the doubling time, as compared to a classical computer, because each extra qubit doubles the available state space, so doubling the qubit count squares it. That could supply the factor of two in the exponent of Moore's law.
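As a minimal sketch of the arithmetic behind that guess (the function name here is just illustrative): the state space of n qubits is a 2^n-dimensional complex vector space, so each extra qubit doubles the dimension, and doubling the number of qubits squares it.

```python
def state_space_dim(n_qubits):
    # n qubits span a 2**n-dimensional complex vector space.
    return 2 ** n_qubits

for n in (1, 2, 4, 8):
    print(n, state_space_dim(n))

# Doubling the qubit count squares the available state space:
assert state_space_dim(8) == state_space_dim(4) ** 2
```

So if qubit counts grew at the classical Moore's-law rate, the state space would square, not merely double, each period, which is where a halved doubling time could come from.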
Quantum computing performance currently isn't doubling, but it isn't jammed either. Decoherence is no longer considered a fundamental limit; it's more a practical inconvenience. The change that brought this about was the invention of quantum error correcting codes.
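To give a feel for how error correcting codes trade redundancy for reliability, here is a classical simulation of the simplest case, the three-bit repetition code (the classical counterpart of the quantum bit-flip code). This is only an illustrative sketch; the function names are made up for this example.

```python
import random

def encode(bit):
    # Repetition code: one logical bit -> three physical bits.
    return [bit, bit, bit]

def noisy_channel(bits, p):
    # Flip each physical bit independently with probability p.
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    # Majority vote corrects any single bit flip.
    return int(sum(bits) >= 2)

def logical_error_rate(p, trials=100_000):
    # Estimate how often the decoded bit is wrong.
    errors = 0
    for _ in range(trials):
        if decode(noisy_channel(encode(0), p)) != 0:
            errors += 1
    return errors / trials

print(logical_error_rate(0.1))
```

With a physical flip probability of p = 0.1, the uncoded error rate is 0.1 but the coded rate is roughly 3p² − 2p³ ≈ 0.028, so the code helps whenever p < 1/2. Quantum codes face the extra complication that qubits cannot be copied, but the same redundancy-beats-noise idea is what defused decoherence as a fundamental limit.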
However, experimental physicists are still searching for the ideal practical implementation. You might compare the situation to the pre-silicon days of classical computing. Until this gets sorted out, I doubt there will be any Moore's-law-style growth.
I looked at:
http://en.wikipedia.org/wiki/Quantum_error_correction
The bit about the threshold theorem looks interesting.
However, I would be more impressed by a working implementation ;-)