The exponential growth of ML research may already be slowing.
Here’s the number of paper hits from the keyword “machine learning”:
http://i.imgur.com/jezwBhV.png
And here’s the number of paper hits from the keyword “pattern recognition”:
http://i.imgur.com/Sor5seJ.png
(Don’t mind the tiny value for 2016; these are papers due to be published next year, and obviously that year’s data hasn’t been collected yet!)
Source: Scopus, plotted with Gadfly
If I had to guess, I’d say we’ve already hit the point of diminishing returns in the ratio of how much material you have to learn to how much you can contribute. Research is hard.
This is interesting, but I wonder how much of it is just a shift in naming.
What does the graph look like for, say, “deep learning” or “neural nets”?
I don’t know. You can plot the data for yourself.
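For anyone who wants to try, here’s a minimal sketch of how such a plot could be produced with Gadfly. The year range and hit counts below are placeholders, not the actual Scopus numbers; substitute your own export.

    # Minimal Gadfly sketch. The years and counts are placeholder values,
    # not real Scopus data.
    using Gadfly

    years = collect(2006:2015)
    hits  = [120, 150, 190, 240, 310, 400, 520, 680, 880, 1100]  # placeholders

    # On a log y-axis, exponential growth shows up as a straight line,
    # so a flattening curve is easy to spot.
    p = plot(x=years, y=hits,
             Geom.point, Geom.line,
             Scale.y_log10,
             Guide.xlabel("Year"),
             Guide.ylabel("Paper hits"),
             Guide.title("Scopus hits for \"machine learning\""))

    # Write the plot out as an SVG file.
    draw(SVG("keyword_hits.svg", 15cm, 10cm), p)

The log scale is the useful part here: if the growth really is exponential, the points fall on a straight line, and any bend toward horizontal is the slowdown being discussed above.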