Hm, yeah, I bet if I reflected more, things would shift around, but I'm not sure that a shortish period where the per-year probability is very elevated, followed by a longer period with a lower per-year probability, is actually a bad sign.
Roughly speaking, right now we're in an AI boom where spending on compute for training big models is going up rapidly, and it's fairly easy to increase spending quickly because current levels are low. There's some chance of transformative AI (TAI) arriving in the middle of this spending boom, and because resource inputs are going up a ton each year, the probability of TAI by date X would also be increasing pretty rapidly.
But the current spending boom is pretty unsustainable if it doesn't lead to TAI. At some point in the 2040s or 2050s, if we haven't gotten TAI by then, we'll have been spending tens of billions of dollars training models, and it won't be that easy to keep ramping up quickly from there. And then, because input growth will have slowed, the increase in probability from one year to the next will also slow. (That said, I'm not sure how this works out exactly.)
(+1. I totally agree that input growth will slow at some point if we don't get TAI soon. I just think you have to be pretty sure that it slows right around 2040 to get the specific numbers you mention, and smoothing over uncertainty about when it will slow gives a smoother probability distribution for TAI.)
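To make the smoothing point concrete, here's a minimal numerical sketch of the model both comments gesture at. All of the specifics (the 5%/1% per-year probabilities, the 2040 slowdown date, and the 2035-2055 uncertainty window) are illustrative assumptions, not figures from this discussion:

```python
# Illustrative sketch: per-year TAI probability is high during the spending
# boom and drops once input growth slows in year T. Pinning T to one year
# gives a kinked cumulative curve; averaging over uncertainty about T
# smooths it out. All numbers below are made up for illustration.
import numpy as np

years = np.arange(2025, 2101)

def per_year_prob(slowdown_year, high=0.05, low=0.01):
    """Per-year probability: `high` until the slowdown year, `low` after."""
    return np.where(years < slowdown_year, high, low)

def tai_by_year(hazard):
    """Cumulative P(TAI by year) implied by a sequence of per-year probabilities."""
    return 1 - np.cumprod(1 - hazard)

# Sharp version: slowdown pinned to exactly 2040.
sharp = tai_by_year(per_year_prob(2040))

# Smoothed version: average the cumulative curves over uncertainty about
# when growth slows (uniform over 2035-2055, purely for illustration).
candidates = np.arange(2035, 2056)
smooth = np.mean([tai_by_year(per_year_prob(t)) for t in candidates], axis=0)

for y in (2036, 2040, 2044, 2060):
    i = int(y - years[0])
    print(f"{y}: sharp={sharp[i]:.2f}  smoothed={smooth[i]:.2f}")
```

Running this, the version with the slowdown pinned to exactly 2040 shows a sharp kink in the cumulative curve around that year, while averaging the same curve over a range of plausible slowdown years yields a noticeably smoother distribution, which is the shape the +1 comment is arguing for.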