The pattern is the same across the entire software industry, not just AI research.
Only a small portion of real progress comes from professors and PhDs. Per person they tend to do pretty well in terms of innovation, but it's hard to beat a million obsessed geeks willing and able to spend every hour of their free time experimenting with something.
The people working in the olden days weren't just working with slower computers; a lot of the time they were working with buggy, crappier languages, feature-poor debuggers, and no IDEs.
A comp sci undergrad working with a modern language in a modern IDE with modern debuggers can whip up in hours what would have taken PhDs weeks back in the early days, and it's not all just hardware.
Don't get me wrong: hardware helps. Having cycles to burn and so much memory that you don't have to care about wasting it saves you time too. But you also get a massive feedback loop: the more people there are in your environment doing similar things, the more you can focus on the novel, important parts of your work rather than fucking around trying to find where you set a pointer incorrectly or screwed up a JUMP.
Very few people have access to supercomputers, and those who do aren't going to spend their supercomputer time going "well, that didn't work, but what if I tried this slight variation..." a hundred times over.
Everyone has access to desktops, so as soon as something can run on consumer electronics, thousands of people can suddenly spend all night experimenting.
Even if the home experimentation doesn't yield results, you now have a generation of teenagers who've spent time thinking about the problem, gained experience thinking in the right terms at a young age, and are primed to reach a far deeper understanding once they hit college.