Humans, or human-level systems, can invent technology from the distant future if given enough time. Nick Bostrom has a whole chapter on speed superintelligence in Superintelligence.
The point is that none of these capabilities, in the particular forms they take in humans, are necessary for AGI, and even large collections of them are insufficient; what matters is only the long-term potential. Because of the speed disparity with humans, autonomous progress is all it takes to gain any other specific capability relatively quickly. So pointing to human capabilities becoming readily available can be misleading and really shouldn't be called AGI, even if it's an OK argument that we are getting there.