$100-200bn 5 GW training systems are now a go. So in the worlds where progress slows down for years because only $30bn systems are available and an additional scaling push would be needed, timelines have moved up a few years. Not sure how unlikely $100-200bn systems would’ve been without o1/o3, but they seem likely now.
What do you think is the current cost of o3, for comparison?
In the same terms as the $100-200bn I’m talking about, o3 is probably about $1.5-5bn, meaning 30K-100K H100s: the system needed to train GPT-4o or GPT-4.5o (or whatever they’ll call it), the base model it might be built on. But that’s the cost of the training system; the time actually needed for training is cheaper, since the rest of the system’s time can be used for other things. In the other direction, it’s more expensive than just that training time because of research experiments. If OpenAI spent $3bn on training in 2024, that was probably mostly research experiments.
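For a sanity check on how the $1.5-5bn figure maps to 30K-100K H100s, here is a minimal back-of-the-envelope sketch. The roughly $50k all-in capital cost per H100 is an assumption on my part (folding in networking and datacenter buildout), not something stated above:

```python
# Back-of-the-envelope check on the o3-scale training-system estimate:
# GPU count times an assumed all-in cost per GPU.

COST_PER_H100_USD = 50_000  # assumed all-in capital cost per H100
                            # (GPU + networking + datacenter buildout)

def system_cost_bn(num_gpus: int, cost_per_gpu: int = COST_PER_H100_USD) -> float:
    """Total training-system capital cost in billions of USD."""
    return num_gpus * cost_per_gpu / 1e9

for gpus in (30_000, 100_000):
    print(f"{gpus:>7,} H100s -> ~${system_cost_bn(gpus):.1f}bn")
# prints:
#  30,000 H100s -> ~$1.5bn
# 100,000 H100s -> ~$5.0bn
```

Under that assumption the two endpoints of the GPU range land exactly on the $1.5bn and $5bn endpoints of the cost range; a different per-GPU figure would shift both proportionally.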