Sure! I argue that we just don’t know whether such a thing as “much more intelligent than humans” can exist. Millions of years of monkey evolution have increased human IQ to the 50–200 range. Perhaps that can go 1000x, perhaps it would level off at 210. The AGI concept assumes it can go to a big number, which might be wrong.
We don’t know, true. But given the space of possible limiting parameters, it seems unlikely that humans are anywhere near the limits. We’re evolved systems, evolved under conditions in which intelligence was far from the most important priority.
And of course under the usual evolutionary constraints (suboptimal lock-ins like backward-wired photoreceptors in the retina, a limited range of biological materials with nothing like transistors or macro-scale wheels, etc.).
And by all reports John von Neumann was barely within the “human” range, yet seemed pretty stable. He came remarkably close to taking over the world, despite there being only one of him and not putting any effort into it.
I think all you’re saying is there’s a small chance it’s not possible.
How did von Neumann come close to taking over the world? Perhaps Hitler, but von Neumann?
Even without having a higher IQ than a peak human, an AGI that merely ran 1000x faster would be transformative.