1) I expect that we will first see AI with human-level thought running roughly 100x slower than you or me. Moore’s law will probably run out before we get AI, and these days Moore’s law is giving us more cores, not faster ones.
If we indeed find no algorithm that runs drastically faster than the brain, Moore’s law shifting to more cores won’t be a problem because the brain is inherently parallelizable.
I think we just mean different things by “human level”—I wouldn’t consider “human level” thought running at 1/5th the speed of a human or slower to actually be “human level”. You wouldn’t really be able to have a conversation with such a thing.
And as Gurkenglas points out, the human brain is massively parallel—more cores instead of faster cores is actually desirable for this problem.