No, I expect (absent agent foundations advances) that people will build superintelligence before they understand the basic shape of the things that AGI will consist of. An illustrative example (though I don't think this exact thing will happen): if the first superintelligence popped out of a genetic algorithm, then people would probably have no idea what pieces went into the thing by the time it exists.