We ran into a hardware shortage during a period when there was no pause, which is evidence that the hardware manufacturer was behaving conservatively.
Alternative hypothesis: there are physical limits on how fast you can build things.
Also, NVIDIA currently has a monopoly on “decent AI accelerators you can actually buy”. Part of the “shortage” is just the standard economic result that a monopolist produces less in order to increase profits.
This monopoly will not last forever, so in that sense we are currently in a hardware “underhang”.
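For concreteness, here is the textbook version of that output-restriction result, a minimal sketch assuming linear inverse demand and constant marginal cost (the symbols a, b, c are purely illustrative and not anything from this thread):

```latex
% Illustrative assumptions: linear inverse demand P(Q) = a - bQ, constant marginal cost c
P(Q) = a - bQ, \qquad MC = c, \qquad a > c > 0, \; b > 0

% Competitive benchmark: price is driven down to marginal cost
a - bQ_{\text{comp}} = c \;\Rightarrow\; Q_{\text{comp}} = \frac{a - c}{b}

% Monopolist: chooses Q where marginal revenue equals marginal cost
MR(Q) = \frac{d}{dQ}\big[(a - bQ)\,Q\big] = a - 2bQ = c
\;\Rightarrow\; Q_{\text{mono}} = \frac{a - c}{2b} = \tfrac{1}{2}\,Q_{\text{comp}}
```

Under those (deliberately oversimplified) assumptions, the monopolist supplies half the competitive quantity at a higher price, which is the output restriction being pointed at here.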
This and the rest of your comment seem to have ignored the rest of my post (see: multiple inputs to progress, all of which seem sensitive to “demand”).
Nvidia doesn’t just make AGI accelerators. They are a video game graphics card company.
And even if we pause large training runs, demand for inference on existing models will continue to increase.
If you think my model of how inputs to capabilities progress are sensitive to demand for those inputs from AGI labs is wrong, then please argue so directly, or explain how your proposed scenario is compatible with it.
This is me arguing directly.
The model “all demand for hardware is driven by a handful of labs training cutting-edge models” is completely implausible. It doesn’t explain how we got the hardware in the first place (video games), and it ignores the fact that there exist uses for AI acceleration hardware other than training cutting-edge models.