It’s somewhat notable that before 2012, AlexNet hadn’t been published yet.
TBC, I think people savvy enough about AI should have predicted that ML was a pretty plausible path and that “lots of compute” was also plausible. (But it’s unclear if they should have put lots of probability on this with the information available in 2010.)
I am more pointing out that they seemed to tacitly assume that deep learning/ML/scaling couldn’t work: all the real work was attributed to what we would call better algorithms, and compute was not viewed as a bottleneck at all.