Thanks for sharing! That’s a pretty sophisticated modeling function, but it makes sense. I personally think Moore’s law (the FLOPS/$ version) will continue, but I know there’s a lot of skepticism about that.
Could you make another graph like Fig 4, but showing projected cost, using Moore’s law to estimate the cost? It’s going to be a lot, right?
Thanks!
Good idea. I might do this when I get the time—will let you know!
We basically lumped the falling cost per FLOP (i.e., improving FLOPS per $) and the increased spending together.
A report from CSET on AI and Compute projects the costs using two strongly simplifying assumptions: (I) compute doubling every 3.4 months (based on OpenAI’s previous report) and (II) the cost of compute staying constant. This should give you something like an upper bound on the projected costs.
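To make that upper bound concrete, here is a rough sketch of what assumptions (I) and (II) imply. The baseline compute and cost-per-FLOP numbers are placeholders I picked for illustration, not figures from the CSET report:

```python
# Sketch of the CSET-style upper bound: compute doubles every 3.4 months
# (the OpenAI "AI and Compute" trend) while the cost per FLOP stays constant.
# Baseline values below are illustrative placeholders, not reported numbers.

BASELINE_COMPUTE_FLOP = 1e23      # assumed compute of a large training run today
COST_PER_FLOP_USD = 1e-17         # assumed constant price, i.e. ~$1M for 1e23 FLOP
DOUBLING_TIME_YEARS = 3.4 / 12    # assumption (I): doubling every 3.4 months

def projected_cost_usd(years_ahead: float) -> float:
    """Projected training cost after `years_ahead`, under assumptions (I) and (II)."""
    doublings = years_ahead / DOUBLING_TIME_YEARS
    compute = BASELINE_COMPUTE_FLOP * 2 ** doublings
    return compute * COST_PER_FLOP_USD  # assumption (II): constant $/FLOP

for years in (1, 2, 5):
    print(f"{years} years out: ~${projected_cost_usd(years):,.0f}")
```

With these placeholder inputs the cost grows by roughly 10x per year, which is why it is best read as an upper bound rather than a forecast.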
Carey’s previous analysis uses this dataset from AI Impacts and therefore assumes:
[..] while the cost per unit of computation is decreasing by an order of magnitude every 4-12 years (the long-run trend has improved costs by 10x every 4 years, whereas recent trends have improved costs by 10x every 12 years).
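If it helps, this is roughly how I would un-lump the two factors for a Fig 4-style cost projection. Only the 10x-per-4-years and 10x-per-12-years rates come from the quote above; the baseline compute, baseline $/FLOP, and compute doubling time are assumptions I made up for illustration:

```python
# Sketch of a cost projection that separates the two lumped-together factors:
# compute grows at an assumed doubling time, while $/FLOP falls by 10x every
# 4 years (long-run trend) or every 12 years (recent trend).

BASELINE_COMPUTE_FLOP = 1e23        # assumed compute of a large run today (placeholder)
BASELINE_COST_PER_FLOP_USD = 1e-17  # assumed current price (placeholder)
COMPUTE_DOUBLING_YEARS = 0.5        # assumed compute growth trend (placeholder)

def projected_cost_usd(years_ahead: float, cost_10x_years: float) -> float:
    """Spending = compute(t) * $/FLOP(t), with $/FLOP falling 10x per `cost_10x_years`."""
    compute = BASELINE_COMPUTE_FLOP * 2 ** (years_ahead / COMPUTE_DOUBLING_YEARS)
    cost_per_flop = BASELINE_COST_PER_FLOP_USD * 10 ** (-years_ahead / cost_10x_years)
    return compute * cost_per_flop

for years in (2, 5, 10):
    fast, slow = projected_cost_usd(years, 4), projected_cost_usd(years, 12)
    print(f"{years}y out: ~${fast:,.0f} (10x/4yr) vs ~${slow:,.0f} (10x/12yr)")
```

Even with the faster 10x-every-4-years price decline, the assumed compute trend outpaces the cost improvement, so projected spending still climbs steeply.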
Interesting, thanks. A 10x reduction in cost every 4 years is roughly twice as fast as I would have expected, but it sounds quite plausible, especially considering AI accelerators and ASICs.