We basically lumped together the falling cost per FLOP and the increased spending.
A report from CSET on AI and Compute projects the costs using two strongly simplifying assumptions: (I) compute demand doubles every 3.4 months (based on OpenAI’s earlier analysis) and (II) the cost of computation stays constant. This gives a rough upper bound on projected costs.
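To see why constant compute cost makes this an upper bound, here is a minimal sketch of the projection arithmetic. The starting cost and the two-year horizon are made-up illustrative numbers, not figures from the CSET report:

```python
# Toy projection of training-cost growth under the two CSET-style assumptions:
# (I) compute doubles every 3.4 months, (II) cost per FLOP stays constant.
# With flat prices, dollar cost grows at the same rate as compute.

def projected_cost(initial_cost_usd: float, months: float,
                   doubling_months: float = 3.4) -> float:
    """Cost after `months`, if compute (and hence cost) doubles every
    `doubling_months` while the price per FLOP stays flat."""
    return initial_cost_usd * 2 ** (months / doubling_months)

# A hypothetical $1M training run today would, on this trend, cost roughly
# 2^(24/3.4), i.e. about 130x, more two years out.
print(f"${projected_cost(1e6, 24):,.0f}")
```

Any real decline in cost per FLOP only pulls the projected dollar figure below this curve, which is why the constant-cost assumption bounds it from above.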
[..] while the cost per unit of computation is decreasing by an order of magnitude every 4-12 years (the long-run trend has improved costs by 10x every 4 years, whereas recent trends have improved costs by 10x every 12 years).
Interesting, thanks. A 10x reduction in cost every 4 years is roughly twice what I would have expected, but it sounds quite plausible, especially considering AI accelerators and ASICs.
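For intuition on how the two trends interact, here is a small sketch combining the 3.4-month compute doubling with the faster, long-run 10x-every-4-years price decline quoted above (illustrative arithmetic only, not from either source):

```python
# Net dollar-cost growth when compute demand doubles every 3.4 months while
# the price per FLOP falls 10x every 4 years (the long-run trend quoted above).

compute_growth_per_year = 2 ** (12 / 3.4)   # ~11.6x more FLOP demanded per year
price_drop_per_year = 10 ** (1 / 4)         # ~1.78x cheaper per FLOP per year
net_cost_growth = compute_growth_per_year / price_drop_per_year

# Even with the faster price decline, spending still grows ~6.5x per year.
print(f"{net_cost_growth:.1f}x per year")
```

The point of the arithmetic: falling hardware prices only partially offset the compute-demand trend, so dollar costs still grow steeply.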
Carey’s previous analysis uses this dataset from AI Impacts, and therefore assumes the order-of-magnitude-every-4-12-years cost decline quoted above.