How do you know this? Have there been a lot of findings made by a lot of people, without any indication that this stream of discoveries is slowing down? When I looked up e.g. deep learning, it seemed to be a relatively old technique (1980s and early 1990s). What are some examples of recent discoveries you would describe as low-hanging fruit?
It’s worth noting that deep learning has made a huge resurgence lately, and is seeing applications all over the place.
There’s tons of active work in online learning, especially under resource constraints.
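To make "online learning under resource constraints" concrete, here is a toy sketch of my own (not a reference implementation) of online gradient descent for logistic regression: one pass over the stream, one cheap update per example, and memory that stays O(dim) no matter how long the stream runs.

```python
import numpy as np

def online_logistic_sgd(stream, dim, lr=0.1):
    """Online gradient descent for logistic regression: one update per
    example as it arrives, O(dim) memory (the resource constraint)."""
    w = np.zeros(dim)
    for x, y in stream:            # y in {0, 1}
        p = 1.0 / (1.0 + np.exp(-w @ x))
        w -= lr * (p - y) * x      # gradient of the log-loss on this example
    return w

# Toy stream: the label is 1 exactly when the first feature is positive.
rng = np.random.default_rng(0)
stream = [(x, float(x[0] > 0)) for x in rng.normal(size=(500, 3))]
w = online_logistic_sgd(stream, dim=3)
```

The point is the access pattern, not the model: each example is seen once and discarded, which is what distinguishes this from batch training.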
Structured prediction is older but still an active and important area of research.
Spectral learning / method of moments is a relatively new technique that seems very promising.
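The spectral / method-of-moments work being referred to targets latent-variable models (mixtures, HMMs), but the core idea, solving for parameters from empirical moments instead of maximizing likelihood, is easy to illustrate on a toy case. A sketch, using the classical moment estimator for a Gamma distribution of my own choosing:

```python
import numpy as np

def gamma_mom(samples):
    """Method-of-moments estimates for Gamma(shape=k, scale=theta):
    the mean is k*theta and the variance is k*theta^2, so we can
    solve for (k, theta) from the empirical mean and variance."""
    m, v = samples.mean(), samples.var()
    return m * m / v, v / m  # (k_hat, theta_hat)

rng = np.random.default_rng(1)
x = rng.gamma(shape=3.0, scale=2.0, size=100_000)
k_hat, theta_hat = gamma_mom(x)  # should be close to (3, 2)
```

The spectral versions generalize this: higher-order moments of observed data are decomposed (e.g. via eigendecompositions) to recover parameters of models with hidden variables, often with global guarantees that EM lacks.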
Conditional gradient techniques for optimization have had a lot of interest recently, although that may slow down in the next couple of years. Similarly for submodular optimization.
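A minimal sketch of the conditional gradient (Frank-Wolfe) idea, on a toy simplex-constrained quadratic of my own choosing: each iteration solves only a *linear* subproblem over the constraint set, whose solution is a vertex, and then moves toward it, so no projection step is ever needed.

```python
import numpy as np

def frank_wolfe_simplex(grad, dim, iters=1000):
    """Conditional gradient (Frank-Wolfe) over the probability simplex.
    The linear minimization oracle over the simplex just picks the
    coordinate with the most negative gradient entry."""
    x = np.ones(dim) / dim
    for t in range(iters):
        g = grad(x)
        s = np.zeros(dim)
        s[np.argmin(g)] = 1.0           # vertex solving min_{s in simplex} <g, s>
        gamma = 2.0 / (t + 2.0)         # standard diminishing step size
        x = (1 - gamma) * x + gamma * s
    return x

# Toy problem: project the point p onto the simplex, i.e. min ||x - p||^2.
p = np.array([0.6, 0.3, -0.2])
x = frank_wolfe_simplex(lambda x: 2 * (x - p), dim=3)
```

Because every iterate is a convex combination of a few vertices, the method naturally produces sparse solutions, which is part of its recent appeal.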
There are many other topics that I think are important but haven’t been quite as stylish lately; e.g. improved MCMC algorithms, coarse-to-fine inference / cascades, dual decomposition techniques for inference.
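For context on what "improved MCMC algorithms" improve over, here is the plain random-walk Metropolis baseline (a toy sketch of mine; the improved samplers mostly change the proposal distribution, e.g. adaptive scaling or gradient-informed moves as in Hamiltonian Monte Carlo):

```python
import numpy as np

def metropolis(log_p, x0, steps=20_000, scale=1.0, seed=0):
    """Random-walk Metropolis: propose x' = x + Gaussian noise and
    accept with probability min(1, p(x') / p(x))."""
    rng = np.random.default_rng(seed)
    x, lp = x0, log_p(x0)
    samples = np.empty(steps)
    for i in range(steps):
        x_new = x + scale * rng.normal()
        lp_new = log_p(x_new)
        if np.log(rng.random()) < lp_new - lp:   # accept/reject step
            x, lp = x_new, lp_new
        samples[i] = x
    return samples

# Target: standard normal, via its log density up to an additive constant.
s = metropolis(lambda x: -0.5 * x * x, x0=0.0)
```

Even on this easy one-dimensional target the chain mixes slowly relative to independent sampling, which is exactly the inefficiency that better proposals aim to fix.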