“Just like the smartest humans alive only a thousand times faster” is actually presented as a conservative scenario to pump intuition in the right direction. It’s almost certainly achievable by known physics, even if it would be very expensive and difficult for us to achieve directly.
An actual superintelligence will be strictly better than that: its early iterations will design systems better than we can, and later iterations running on those systems will design systems more effective than anything we can imagine, or even properly comprehend if it were explained to us. Those systems might not be strictly faster, but speed is much easier to extrapolate and communicate than genuinely superhuman intelligence. People can grasp what it means to think much faster in a way that they fundamentally cannot grasp what it means to be smarter.
So in a way, the actual question asked here is irrelevant. A speedup is just an analogy to try to extrapolate to something—anything—that is vastly more capable than our thought processes. The reality would be far more powerful still in ways that we can’t comprehend.
I am unconvinced by this.
I get your broader point, though.
That said, I am still curious how practical speed superintelligences would be; I don’t think it’s an irrelevant question.