Algorithmic progress on ImageNet seems to effectively halve compute requirements every 4 to 25 months (Erdil and Besiroglu 2022); assume that the doubling time is 50% longer for transformers.[7]
Pretty sure several recent advances have proven this assumption to be very wrong indeed.
For example:
https://arxiv.org/abs/2303.06865
https://arxiv.org/abs/2208.07339
Plus the stuff with Alpaca (LLaMA fine-tuned on instruction data generated with GPT assistance).
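For what it’s worth, the second paper above (LLM.int8()) is already usable off the shelf. A minimal sketch, assuming the bitsandbytes integration in Hugging Face transformers (the model name is just an example):

```python
# Minimal sketch: 8-bit inference via the LLM.int8() integration
# (arXiv:2208.07339) in Hugging Face transformers + bitsandbytes.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "facebook/opt-6.7b"  # illustrative; swap in whatever model you're testing
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    device_map="auto",   # spread layers across available GPUs/CPU
    load_in_8bit=True,   # quantize weights to int8 at load time
)

inputs = tokenizer("Algorithmic progress means", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=20)[0]))
```

Int8 weights take half the memory of fp16 with little quality loss, which is the kind of gain that makes fixed compute-per-capability assumptions look shaky.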
That’s assuming we don’t get an algorithmic breakthrough that improves compute-cost-per-capability-level substantially over current transformers. I think that’s likely to happen in the next couple years.
Plus there are hardware developments currently in progress, like Nvidia’s work and Fathom Radiant and others, which could lead to AI-specific hardware improvements out of step with Moore’s law.
I’m willing to make bets against AI Winter before TAI, if anyone has a specific bet to propose...
Thanks! Those papers are new to me; I’ll have a look.
I just want to call attention to the fact that my operationalisation (“a drawdown in annual global AI investment of ≥50%”) is pretty inclusive (maybe too much so). I can imagine some scenarios where this happens and then we get TAI within 5 years after that anyway, or where this happens but it doesn’t really look like a winter.
(Partly I did this to be more “charitable” to Eden—to say, “AI winter seems pretty unlikely even on these pretty conservative assumptions”, but I should probably have flagged the fact that “≥50% drawdown” is more inclusive than “winter” more clearly.)
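For concreteness, this is roughly how I’d check that operationalisation against data; a minimal sketch with made-up annual figures:

```python
# Minimal sketch of the ">=50% drawdown in annual global AI investment"
# operationalisation. The figures below are made up for illustration.

def max_drawdown(series):
    """Largest peak-to-trough decline, as a fraction of the peak."""
    peak = series[0]
    worst = 0.0
    for x in series[1:]:
        peak = max(peak, x)
        worst = max(worst, (peak - x) / peak)
    return worst

annual_investment = [45, 68, 94, 120, 91, 55]  # $bn per year, hypothetical
dd = max_drawdown(annual_investment)
print(f"max drawdown: {dd:.0%} -> counts under the operationalisation: {dd >= 0.5}")
```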
Yeah, your steelman seems pretty reasonable to me actually. My comment was more a reaction to Eden’s original stance. Bonus content… Here’s a related pithy tidbit: https://youtube.com/clip/UgkxGJnic8Q7W59Vo1C4TdnwIugx5FaqDpN_
Is that a typo? That’s such a broad range that the statistic is completely useless: over the 25 months it takes the slower rate to halve compute requirements once, the faster rate halves them more than six times, over 32 times as much reduction. Those are completely different worlds.
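A quick sketch for anyone who wants to check the numbers (this also applies the footnote’s “50% longer for transformers” adjustment):

```python
# Quick arithmetic on the 4-vs-25-month halving times, plus the footnote's
# assumption that transformer halving times are 50% longer.

def compute_reduction(months, halving_time):
    """Factor by which compute requirements shrink after `months`."""
    return 2 ** (months / halving_time)

window = 25  # months: long enough for the slower rate to halve exactly once
fast, slow = compute_reduction(window, 4), compute_reduction(window, 25)
print(f"over {window} months: {fast:.0f}x vs {slow:.0f}x reduction "
      f"(~{fast / slow:.0f}x apart)")              # ~76x vs 2x, ~38x apart

fast_tf = compute_reduction(window, 4 * 1.5)       # ~18x
slow_tf = compute_reduction(window, 25 * 1.5)      # ~1.6x
print(f"with the 1.5x transformer adjustment: {fast_tf:.0f}x vs {slow_tf:.1f}x")
```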