Compared to 2012, it now takes 44 times less compute to train a neural network to the level of AlexNet (by contrast, Moore’s Law would yield an 11x cost improvement over this period). Our results suggest that for AI tasks with high levels of recent investment, algorithmic progress has yielded more gains than classical hardware efficiency.
If 11x of the 44x total speedup is from hardware, doesn’t that leave just 4x from software?
Moore’s Law simply means that the compute that is now 44x smaller is also ~11x cheaper to buy, right? Moore’s Law doesn’t make algorithms need less compute; it just lowers the cost of that compute. So the 11x isn’t a slice of the 44x — the two gains multiply rather than split.
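If it helps, here’s that arithmetic as a minimal Python sketch. The 44x and 11x figures come from the post itself; everything else is just spelling out the multiplication:

```python
# Illustrative numbers from the OpenAI post (2012 -> 2019, AlexNet-level accuracy):
algorithmic_gain = 44  # compute needed fell 44x due to better algorithms alone
hardware_gain = 11     # Moore's-Law-style cost-per-unit-compute improvement, same period

# The factors are independent and multiply; hardware is not a component of the 44x.
total_cost_reduction = algorithmic_gain * hardware_gain
print(total_cost_reduction)  # 484 -> ~484x cheaper overall, not 44/11 = 4x from software
```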
Makes sense.