Right now it matters at most to the very biggest handful of labs.
That sounds right, but it’s unclear to me how many companies would want to train 10^26 FLOP models in 2030.
I think still not very many, because training a model is a big industrial process with major economies of scale and winner-take-most effects. It’s a place where specialization really makes sense. My guess is that fewer than 20 companies in the US will be training models of that size, with everyone else licensing them or using them through an API.
But the Bill apparently includes a provision for that: the standards for what counts as a covered model change after 2027.