I’ve not seen the claim that the scaling laws are bending. Where should I look?
It's something I remember hearing on a podcast, but I don't remember which one, and there is a chance it was never said in the sense I interpreted it.
Also, a quote from this post:
“DeepMind says that at large quantities of compute the scaling laws bend slightly, and the optimal behavior might be to scale data by even more than you scale model size. In which case you might need to increase compute by more than 200x before it would make sense to use a trillion parameters.”
That was quite a while ago, and it is not a very strongly worded claim. I think there was also evidence that Chinchilla got a constant factor wrong, and people kept discovering that you wanted a substantially larger data:parameter multiplier, which might fully account for any 'slight bending' back then; bending often just means you got a hyperparameter wrong and need to tune it better. (It's a lot easier to break scaling than to improve it, so bending away badly is not too interesting, while bending in the opposite direction is much more interesting.)
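To make the 'constant factor' point concrete, here is a minimal sketch of Chinchilla-style compute-optimal allocation, assuming the parametric loss L(N, D) = E + A/N^alpha + B/D^beta and the Approach-3 fitted constants from Hoffmann et al. (2022); those constants are exactly what later refits disputed, so treat the outputs as illustrative rather than authoritative:

```python
# Sketch of Chinchilla-style compute-optimal allocation, assuming the
# parametric loss L(N, D) = E + A/N**alpha + B/D**beta and the Approach-3
# constants reported in Hoffmann et al. (2022). The irreducible term E
# drops out of the optimization, so it is omitted here.

def optimal_allocation(C, A=406.4, B=410.7, alpha=0.34, beta=0.28):
    """Minimize L(N, D) subject to the compute budget C = 6 * N * D (FLOPs).

    Closed form: N* = G * (C/6)**a and D* = (1/G) * (C/6)**b, where
    G = (alpha*A / (beta*B))**(1/(alpha+beta)),
    a = beta/(alpha+beta), b = alpha/(alpha+beta).
    """
    G = (alpha * A / (beta * B)) ** (1 / (alpha + beta))
    a = beta / (alpha + beta)   # exponent governing model size N
    b = alpha / (alpha + beta)  # exponent governing training tokens D
    N = G * (C / 6) ** a
    D = (1 / G) * (C / 6) ** b
    return N, D

for C in (1e21, 1e23, 1e25):
    N, D = optimal_allocation(C)
    # D/N is the tokens-per-parameter ratio; under these constants it
    # drifts upward with compute, since b > a.
    print(f"C={C:.0e}: N={N:.2e} params, D={D:.2e} tokens, D/N={D/N:.0f}")
```

With these published constants b > a, so the optimal token count grows faster than the parameter count and D/N drifts upward with compute, which reads like the 'scale data more than model size' bend in the quoted claim. Note that a refit changing A and B shifts the whole D/N schedule by a constant factor, while changes to alpha and beta change whether it drifts at all, so an imperfect fit can easily masquerade as 'slight bending'.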