Kudos for not naively extrapolating past 100% of GDP invested into AI research.
Reduced Cost of Computation: Estimated to reduce by 50% every 2.5 years (about in line with current trends), down to a minimum level of 1/10^6 (i.e., 0.0001%) in 50 years.
Increased Availability of Capital for AI: Estimated to reach a level of $1B in 2025, then double every 2 years after that, up to 1% of US GDP (currently would suggest $200B of available capital, and growing ~3% per year).
Our current trends in the cost of computation coincide with (seemingly) exponentially increasing investment. If some or all of the reduced cost of computation is driven by that increasing investment, these numbers suggest a knee in the computation-cost curve around 2040[1].
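To make footnote [1] concrete, a quick sketch (using the quoted estimates: $1B in 2025, doubling every 2 years, capped at 1% of GDP, i.e. roughly $200B growing ~3%/year):

```python
import math

# Closed form, ignoring GDP growth: 2025 + 2 * log2($200B / $1B)
print(2025 + 2 * math.log2(200e9 / 1e9))  # ≈ 2040.3

# Year by year, letting the 1%-of-GDP cap itself grow ~3%/year
capital, cap, year = 1e9, 200e9, 2025
while capital < cap:
    capital *= 2 ** 0.5  # doubling every 2 years, stepped annually
    cap *= 1.03          # the cap rises with GDP
    year += 1
print(year)  # 2042: slightly later, as footnote [1] notes
```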
There are also other limits here. For instance: since 2010, DRAM bits/$ has been rising by ~18%[4] per year on average[2]. That’s significant, but not the 32%[5]/year that 2x every 2.5 years would imply. For now, DRAM cost hasn’t been the limiting factor… but will that hold under continued exponential scaling?
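Working out the two rates (footnotes [4] and [5]), plus an illustrative compounding of the gap:

```python
# DRAM bits/$: an order of magnitude over ~14 years → ~18%/year
dram = 10 ** (1 / 14)       # ≈ 1.1788
# "2x every 2.5 years" → ~32%/year
compute = 2 ** (1 / 2.5)    # ≈ 1.3195
print(dram, compute)

# Illustration: if both trends held, compute/$ would outrun DRAM bits/$
print((compute / dram) ** 20)  # ≈ 9.5x gap after 20 years
```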
Estimated to reduce computation required by 50% every 2-3 years based on observed recent progress
I’d be careful extrapolating here. Compilers also made major progress in their ‘early’ days, but at this point their output-efficiency doubling time is somewhere between two and five decades[3][6].
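For the compiler numbers, here’s how the 2022 result in footnote [3] converts to a doubling time (a sketch, assuming the 10-15% improvement compounds evenly over the decade):

```python
import math

# 10-15% total output-performance improvement over 10 years
for total in (1.10, 1.15):
    annual = total ** (1 / 10)
    doubling = math.log(2) / math.log(annual)
    print(f"{total:.2f}x over 10 years → ~{doubling:.0f}-year doubling time")
# prints ~73 and ~50 years, versus Proebsting's original 18
```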
I also wonder how much of recent progress has been driven by increased investment. Do you have numbers on this?
2025 + 2 × log2($200B / $1B) ≈ 2040. Actually slightly later, because GDP does grow somewhat over that time.
https://aiimpacts.org/trends-in-dram-price-per-gigabyte/ → “Since 2010, the price has fallen much more slowly, at a rate that would yield an order of magnitude over roughly 14 years.”
Proebsting’s Law is an observation that compilers roughly double the performance of the output program, all else being equal, with an 18-year doubling time. The 2001 reproduction suggested more like 20 years under optimistic assumptions. A 2022 informal test showed a 10-15% improvement on average in the last 10 years, which is closer to a 50-year doubling time.
x^14 = 10 → x ≈ 1.1788
x^2.5 = 2 → x ≈ 1.3195
Personally, I think most seemingly-exponential curves are subexponential, but that’s another matter.