It just comes from treating classical computers as the correct measuring stick. It would be more precise to refer, as you do, to 18 months as the add-one time rather than the doubling time. But if you do call it the doubling time, then for quantum computers it becomes the 4x time. Of course, it’s not uniform: it doesn’t apply to problems in P.
With classical computers, Moore’s law improves serial and parallel performance simultaneously, by making components smaller.
With quantum computers, serial and parallel performance are decoupled: adding qubits improves parallel performance, while miniaturisation has no effect on the number of qubits but does improve serial processing performance. So there are two largely independent means of speeding up quantum computing. Which one supposedly doubles twice as fast as classical computers? Neither, AFAICS.
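The arithmetic behind the add-one-time framing can be sketched quickly (a minimal illustration; the function name `dim` is just for this example): the state space of an n-qubit register has dimension 2^n, so adding a single qubit doubles it, and adding two qubits quadruples it.

```python
def dim(n_qubits: int) -> int:
    # Dimension of the state space of an n-qubit register.
    return 2 ** n_qubits

# Adding one qubit doubles the state space...
assert dim(11) == 2 * dim(10)
# ...so adding two qubits quadruples it: a fixed "add-one" interval
# is a doubling interval, and two such intervals give a 4x factor.
assert dim(12) == 4 * dim(10)
```

Note this only describes the growth of the parallel resource (the state space); it says nothing about serial performance, which is exactly the decoupling described above.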
Sorry, my original response should have been “yes, you aren’t getting into the spirit of the counterfactual.”