OK, but in this case I’m trying to imagine something that’s not significantly smarter than humans. So it probably can’t think of any self-improvement ideas that an AI scientist wouldn’t have thought of already, and even if it did, it wouldn’t have the ability to implement them without first getting access to huge supercomputers to re-train itself. Right?
I worry that I’m splitting hairs now, because it seems the AI only needs to be clever enough to generate the following in response to a query:
“The answer to your question will be provided more quickly if you provide 1 GB of additional RAM.” (Rinse and repeat until we’re in an AI-box scenario.)