I’m not sure whether you’ve seen it, but here’s a relevant clip where he mentions that they aren’t training GPT-5. I don’t quite know how to update on it. It doesn’t seem likely that they paused out of a desire to do more safety work, but I would also be surprised if they were hitting some performance limit from model size.
However, as Zvi mentions, Sam did say:
The expectation is that GPT-5 would be the next GPT-N, trained with roughly 100x the compute of GPT-4, but that would probably cost tens of billions of dollars, so GPT-N scaling is over for now.
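To make the "tens of billions" figure concrete, here’s a minimal back-of-envelope sketch. It assumes GPT-4’s training run cost on the order of $100M (Altman has publicly said it was "more than $100 million") and that cost scales roughly linearly with training compute; both are loose assumptions, not official figures:

```python
# Back-of-envelope cost estimate for a hypothetical GPT-5.
# Assumptions (not official figures):
#   - GPT-4 training cost ~ $100M (Altman: "more than $100 million")
#   - GPT-5 would use ~100x GPT-4's training compute (the GPT-N pattern)
#   - cost scales roughly linearly with compute (ignores hardware gains)

gpt4_training_cost_usd = 100e6  # assumed ~$100M
compute_multiplier = 100        # ~100x compute for the next GPT-N

gpt5_cost_estimate_usd = gpt4_training_cost_usd * compute_multiplier
print(f"Naive GPT-5 cost estimate: ${gpt5_cost_estimate_usd / 1e9:.0f}B")
# -> Naive GPT-5 cost estimate: $10B
```

Even this naive linear scaling lands at roughly $10B, the bottom edge of "tens of billions", before accounting for the datacenter buildout such a run would require.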