Suggestion to test your theory: look at the best AI results of the last 2 years and try to run or test them in a reasonable time on a computer that was affordable 10 years ago.
My own opinion is that hardware capacity has been a huge constraint in the past. We are moving into an era where it is less of a problem, but it is, I think, still a problem. Hardware limitations infect and limit your thinking in all sorts of ways and slow you down terribly.
To take an example from my own work: I have a problem that needs about 50 GB of RAM to test efficiently. Otherwise it does not fit in memory and the runtime is roughly 100x slower.
I had the option of spending maybe 6 months finding a way to squeeze it into 32 GB. Instead, what I did was spend a few thousand on a machine with 128 GB of RAM. Running it in 1 GB of RAM would have been a world of pain, and maybe not doable in the time I have to work on it.
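To make the tradeoff concrete, here is a minimal sketch of the fork you face. It is a hypothetical NumPy-style workload, not my actual code; the file path, array shape, chunk size, and the psutil memory check are all illustrative assumptions. If the working set fits in RAM you take one fast in-memory pass; if it does not, you fall back to a memory-mapped, chunked pass that keeps going back to disk, which is where slowdowns on the order of 100x come from.

    # Sketch only: pick in-memory vs. out-of-core processing based on free RAM.
    import numpy as np
    import psutil  # assumed available; used to query free memory

    def process(chunk: np.ndarray) -> float:
        # Stand-in for the real per-chunk computation.
        return float(chunk.sum())

    def run(path: str, shape: tuple, dtype=np.float64) -> float:
        working_set = int(np.prod(shape)) * np.dtype(dtype).itemsize
        if working_set < 0.8 * psutil.virtual_memory().available:
            # Fast path: the whole array fits in RAM, one pass over the data.
            data = np.fromfile(path, dtype=dtype).reshape(shape)
            return process(data)
        # Slow path: memory-map the file and stream it in chunks.
        # Every chunk goes back to disk, which is where the ~100x penalty lives.
        data = np.memmap(path, dtype=dtype, mode="r", shape=shape)
        return sum(process(data[i:i + 10_000]) for i in range(0, shape[0], 10_000))

Buying more RAM is, in effect, paying to stay on the fast path instead of spending months engineering the slow one.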