It’s worth noting that deep learning has made a huge resurgence lately and is seeing applications across many domains.
There’s tons of active work in online learning, especially under resource constraints.
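Since online learning is only named above, here is the simplest concrete instance I know of: the perceptron, which processes one example at a time and updates only on mistakes. The synthetic data and dimensions are my own illustration, not anything from a particular paper.

```python
import numpy as np

# Minimal online-learning sketch: the perceptron. Each example arrives
# one at a time and the weights are updated only when a mistake is made.
# (Data below is synthetic, chosen just for illustration.)

def perceptron(X, y):
    w = np.zeros(X.shape[1])
    mistakes = 0
    for xi, yi in zip(X, y):        # yi in {-1, +1}
        if yi * (w @ xi) <= 0:      # mistake-driven update
            w += yi * xi
            mistakes += 1
    return w, mistakes

rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0, 0.5])         # hypothetical true separator
X = rng.normal(size=(500, 3))
y = np.sign(X @ w_true)                      # linearly separable labels
w, mistakes = perceptron(X, y)
```

Because every update adds `yi * xi` with `yi = sign(xi @ w_true)`, each mistake strictly increases the alignment of `w` with the true separator, which is the heart of the classical mistake-bound analysis.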
Structured prediction is older but still an active and important area of research.
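As a point of reference for structured prediction, the canonical inference problem is decoding a whole label sequence jointly rather than position by position. A minimal Viterbi decoder on a chain, with made-up scores purely for illustration, looks like this:

```python
import numpy as np

# Structured prediction in a nutshell: the output is a joint object
# (here, a label sequence), so inference must search over exponentially
# many candidates. Viterbi decoding on a chain is the simplest example;
# the scores below are illustrative, not from any real model.

def viterbi(emit, trans):
    """emit: (T, K) per-position label scores; trans: (K, K) transition scores."""
    T, K = emit.shape
    score = emit[0].copy()
    back = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        cand = score[:, None] + trans       # (prev_label, cur_label) scores
        back[t] = cand.argmax(axis=0)       # best predecessor per label
        score = cand.max(axis=0) + emit[t]
    path = [int(score.argmax())]
    for t in range(T - 1, 0, -1):           # follow backpointers
        path.append(int(back[t][path[-1]]))
    return path[::-1]

emit = np.array([[1., 0.], [0., 1.], [1., 0.]])
trans = np.array([[2., 0.], [0., 2.]])      # "sticky" transitions
best = viterbi(emit, trans)                 # transitions override the middle emission
```

The point of the example: with strong transition scores the jointly best sequence differs from greedily picking the best label at each position, which is exactly why structured inference matters.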
Spectral learning / method of moments is a relatively new technique that seems very promising.
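To make the method-of-moments idea concrete: match sample moments to model moments and solve for the parameters. The sketch below is the classical one-dimensional flavor (a gamma distribution); spectral learning proper extends this idea to latent-variable models via higher-order moment tensors, which is well beyond this toy example.

```python
import numpy as np

# Classical method-of-moments sketch: recover gamma(shape, scale)
# parameters by matching sample moments to model moments.
# (Spectral learning applies the same principle to latent-variable
# models using higher-order moments; this is only the simplest case.)

def gamma_mom(samples):
    m = samples.mean()
    v = samples.var()
    # For a gamma distribution: mean = k * theta, var = k * theta^2,
    # hence k = mean^2 / var and theta = var / mean.
    k = m * m / v
    theta = v / m
    return k, theta

rng = np.random.default_rng(0)
data = rng.gamma(shape=3.0, scale=2.0, size=50_000)
k_hat, theta_hat = gamma_mom(data)
```

Unlike maximum likelihood, this requires no iterative optimization, which is part of why moment-based estimators are attractive for models where the likelihood is intractable.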
Conditional gradient (Frank-Wolfe) techniques for optimization have attracted a lot of interest recently, although that may slow down in the next couple of years. Similarly for submodular optimization.
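For readers who haven’t seen conditional gradient before, the whole trick is to replace projection with a linear minimization oracle over the feasible set. A minimal sketch, assuming a least-squares objective over the probability simplex (my choice of toy problem, where the oracle is just picking the coordinate with the smallest gradient entry):

```python
import numpy as np

# Minimal Frank-Wolfe (conditional gradient) sketch: minimize
# f(x) = 0.5 * ||Ax - b||^2 over the probability simplex.
# The problem and dimensions here are illustrative assumptions.

def frank_wolfe_simplex(A, b, n_iters=200):
    n = A.shape[1]
    x = np.ones(n) / n                      # start at the simplex center
    for t in range(n_iters):
        grad = A.T @ (A @ x - b)            # gradient of f at x
        # Linear minimization oracle over the simplex: the minimizing
        # vertex is the coordinate with the smallest gradient entry.
        s = np.zeros(n)
        s[np.argmin(grad)] = 1.0
        gamma = 2.0 / (t + 2.0)             # standard step-size schedule
        x = (1 - gamma) * x + gamma * s     # convex combination stays feasible
    return x

rng = np.random.default_rng(0)
A = rng.normal(size=(30, 5))
b = rng.normal(size=30)
x = frank_wolfe_simplex(A, b)               # every iterate lies on the simplex
```

Note that no projection step is ever needed: every iterate is a convex combination of simplex vertices, which is exactly what makes conditional gradient attractive when projections are expensive.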
There are many other topics that I think are important but haven’t been quite as fashionable lately, e.g. improved MCMC algorithms, coarse-to-fine inference / cascades, and dual decomposition techniques for inference.
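For context on what “improved MCMC” improves upon, here is the vanilla random-walk Metropolis-Hastings algorithm targeting a standard normal. This is only the baseline that work on better samplers (HMC, adaptive proposals, and so on) builds on; the target and step size are my own illustrative choices.

```python
import numpy as np

# Baseline random-walk Metropolis-Hastings targeting a standard normal.
# Shown purely as the point of reference that "improved MCMC" work
# starts from; target density and step size are illustrative.

def metropolis_normal(n_samples=20_000, step=1.0, seed=0):
    rng = np.random.default_rng(seed)
    log_p = lambda z: -0.5 * z * z          # log-density up to a constant
    x = 0.0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        prop = x + step * rng.normal()      # symmetric random-walk proposal
        # Accept with probability min(1, p(prop) / p(x)).
        if np.log(rng.uniform()) < log_p(prop) - log_p(x):
            x = prop
        samples[i] = x                       # rejected moves repeat x
    return samples

draws = metropolis_normal()
```

The slow, random-walk exploration of this baseline (successive draws are highly correlated) is precisely what gradient-based and adaptive samplers aim to fix.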