Wow, thank you so much! (And I apologize for the late reaction.) This is great, really.
Linear increase in performance with exponential investment does not a FOOM make.
Indeed. It seems FOOM might require a world in which processors can be made arbitrarily tiny.
I have been going over all your points and find them all very interesting. My current intuition on “recursive self-improvement” is that deep learning may be about the closest thing we can get, and that the performance of such systems will asymptote relatively quickly when it comes to general intelligence. That is, I don’t think super-human deep learning systems are impossible, but I don’t assign high probability to an exponential trend smashing through the human level.