I meant that ‘copying’ above is only necessary in the human case, as a way to escape the slowly evolving biological brain. While copying is certainly available to a hypothetical AGI, it is not strictly necessary for self-improvement (at least, copying the whole AGI isn’t).