Training runs already take months.
I’d expect that to take several generations of models, so double-digit numbers of months even in an aggressive scenario?
(Barring drastic jumps in compute that cut months-long training runs to hours or days.)
Read paragraph 2
But yes, foom wasn’t going to happen. It takes time for AI to be improved; turns out reality gets a vote.