This was really interesting to read. I’m still pretty new to the AI space, so I don’t know how this compares to our current FLOP usage. Assuming our current trajectory of computing power doesn’t change, how long would it take to reach 10^34 FLOP?
Well, it’s 12 OOMs more than our current FLOP usage. ;) Ajeya’s report is excellent and contains the best and most thorough answers to your questions, I think. If I recall correctly, she projects that the price of compute will halve every 2.5 years on average, and that algorithmic/efficiency improvements will have a similar effect. And then there’s a one-time boost of up to +5 OOMs that we get from just spending a lot more. So she projects we get to (the equivalent of) +12 OOMs around 2050. Personally, I think the price of compute will drop a bit faster over the next ten years, and the algorithmic improvements will be a bit better in the near term too. But I’m very uncertain about that and mostly just defer to her judgment.
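To make the rough arithmetic concrete, here’s a back-of-the-envelope sketch of how those three factors add up to ~+12 OOMs by ~2050. The specific numbers (a 30-year horizon from 2020, a 2.5-year halving time, a matching algorithmic contribution, and a +5 OOM spending boost) are my own simplifying assumptions based on the summary above, not exact figures from the report:

```python
import math

# Assumed inputs (rough, not exact figures from Ajeya's report):
years = 2050 - 2020               # ~30-year horizon
halving_time = 2.5                # years per halving of compute price
oom_per_halving = math.log10(2)   # each halving ~= 0.30 orders of magnitude

price_ooms = (years / halving_time) * oom_per_halving  # ~3.6 OOMs from cheaper compute
algo_ooms = price_ooms                                  # similar effect from algorithms, ~3.6 OOMs
spend_ooms = 5                                          # one-time boost from spending much more

total = price_ooms + algo_ooms + spend_ooms
print(f"~{total:.1f} OOMs of effective compute by 2050")  # prints ~12.2
```

So under those assumptions the three contributions land right around +12 OOMs by mid-century, which is why the 2050 figure falls out.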
Awesome! Thanks for your answer!