For example, if you subscribe to the MWI model, gaining access to the googolplex of the worlds created every femtosecond and harnessing their computational resources would effectively remove anything resembling a computational speed limit.
So, you can think of this as what quantum computers do, and there’s still a pretty normal speed limit. Because all (traditional) interpretations of quantum mechanics run off the exact same math, a good test to apply in these cases is that if it only seems to work in one interpretation, you’ve probably made a mistake.
And of course, unlike Lord Kelvin’s famous claim, we didn’t have to discover any new and unexpected physics to build heavier-than-air flying machines. Carrier’s statement is literally correct, then—technology will not get you around quantum-mechanical limits, such as they are.
True enough, I was referring to the next breakthrough in quantum physics, which, in my estimation, is likely to happen before we reach the current quantum limits, when the interpretations might actually become useful models.
Sometimes technological advances are also unexpected. Remember when 9600 bps was considered the limit for phone lines? We are 20,000 times faster than that now.
Actually, we’re only about five times faster than that, and the real (Shannon) limit for analog phone lines is somewhere in the 60-100 kbps range. It’s not fair to compare a modulation capable of using megahertz of bandwidth on a short unfiltered line with a modulation designed explicitly to be bounded by a 3 kHz voice path and to work across a purely analog channel hundreds or thousands of kilometers in length.
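As a sanity check on that figure, you can plug a voice channel into Shannon’s capacity formula, C = B·log2(1 + S/N). The bandwidth and SNR values below are illustrative assumptions for a typical analog line, not measurements:

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon channel capacity C = B * log2(1 + S/N), with SNR given in dB."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# ~3 kHz usable bandwidth at ~30 dB SNR: a plausible ordinary voice line.
print(round(shannon_capacity_bps(3000, 30)))  # ~30 kbps

# A wider, cleaner path (3.5 kHz, 45 dB) pushes toward the higher estimates.
print(round(shannon_capacity_bps(3500, 45)))  # ~52 kbps
```

The point of the exercise is that capacity grows only logarithmically with SNR, so even heroic improvements in line quality buy surprisingly few extra bits per second.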
Some background:
9600 bps modems used around 2.7 kHz of bandwidth, and had low-grade error correction that allowed fairly reliable connectivity but didn’t have a huge coding gain. This was extended up to 19200 bps using V.32ter, but that never actually caught on as a standard, and work on V.34 began.
V.34, for all its problems and imperfections, to this day remains a work of art. It can use almost the entire available spectrum of an analog line (all of 3.5 kHz or so), can compensate for many different types of line distortion, and automatically adjusts to changing line conditions to maintain its connection. It really is a piece of high-technology software—and its primary design criterion is to push bits reliably through a comm channel that’s not a whole lot better than two tin cans and some string.
(Note that V.90 and V.92 are faster than V.34, but they are digital standards, and make use of much stronger constraints on line quality. They also directly operate on digital data instead of having to do an extra A/D transform. The techniques and assumptions used in these standards are very different from V.34 and allow higher data rates, but when those assumptions are violated, V.90/92 fall back to V.34.)
The max data rate for V.34 is 33.6 kbps. There are a lot of improvements that could be made to V.34 with modern technology, the most significant of which would be use of better error correction. But even with all the resources of mankind thrown at the problem, I would be shocked if we could double the average data rate without loosening the channel constraints.
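To see why doubling the rate is so implausible, invert Shannon’s formula: the SNR required for a given rate grows exponentially with spectral efficiency. The 3.5 kHz bandwidth figure comes from the V.34 discussion above; the rest is just the capacity formula rearranged:

```python
import math

def required_snr_db(rate_bps: float, bandwidth_hz: float) -> float:
    """Minimum SNR (in dB) at which Shannon capacity reaches rate_bps."""
    snr_linear = 2 ** (rate_bps / bandwidth_hz) - 1
    return 10 * math.log10(snr_linear)

BW = 3500  # roughly the full analog-line spectrum V.34 can exploit

print(required_snr_db(33_600, BW))  # V.34's top rate: ~29 dB needed
print(required_snr_db(67_200, BW))  # doubled rate: ~58 dB needed
```

An extra ~29 dB is nearly a thousandfold improvement in signal-to-noise ratio, which is simply not available on a long analog voice path no matter how clever the modem is.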
I agree with your caveats, but my point was, to quote Wikipedia, “For many years, most engineers considered this rate to be the limit of data communications over telephone networks.” Yet it only took some extra technology, and no new advances in physics, to increase the effective throughput by 4 orders of magnitude (and counting). It might well happen that the apparent quantum limit on computer performance is only a technological obstacle, not a fundamental one.