The biggest ML systems, like GPT-3 and MT-NLG, require increasingly large amounts of computing power to train. Normally that means racks of expensive GPUs doing the math that simulates a neural network. But the human brain, still better than any computer at a wide variety of tasks, is fairly small. Plus, it uses far less energy than huge GPU clusters!
As much as AI has progressed in the last few years, is it headed for an AI winter? Not with new hardware on the horizon. By mimicking the human brain, neuromorphic chips and new analog computers can do ML calculations faster than GPUs, at the expense of some precision. Since “human-y” tasks usually demand intuition rather than exact arithmetic, that trade-off means AI can catch up to human performance across the board, not just in precise number-crunching. Put all this together and we get a disturbing picture: a mind far faster, smarter, and less fragile than a human brain, optimizing whatever function its programmers gave it.
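Why does losing precision cost so little? A neural network sums over many noisy contributions, so per-element rounding errors mostly cancel. Here’s a minimal sketch of that intuition in Python (assuming NumPy; the `quantize` helper is a hypothetical stand-in for an analog device’s limited resolution, not any real chip’s API):

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy fully connected layer: 256 inputs -> 64 outputs.
x = rng.standard_normal(256)
W = rng.standard_normal((64, 256)) * 0.1

def quantize(a, bits=8):
    """Round values onto a uniform grid, mimicking a low-precision analog device."""
    scale = np.max(np.abs(a)) / (2 ** (bits - 1) - 1)
    return np.round(a / scale) * scale

exact = np.tanh(W @ x)                       # full float64 precision
approx = np.tanh(quantize(W) @ quantize(x))  # ~8-bit weights and inputs

# Rounding noise largely averages out across the 256 summed terms,
# and the tanh nonlinearity compresses what's left.
print(np.linalg.norm(exact - approx) / np.linalg.norm(exact))
```

The printed relative error comes out on the order of a percent, even though every weight and input was rounded to one of only a couple hundred levels. For “intuition” tasks, that error is lost in the noise.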