Ah, I’m pretty sure Eliezer has shorter timelines than you. He’s been cagey about it, but he sure acts like it, and various of his public statements seem to suggest it. I can try to dig them up if you like.
If “now” means “when AI is having ~$1b/year of impact,” and “AGI” means “AI that can do anything a human can do, better,” then yes, I think that’s roughly what I’m saying.
Yep, that’s one way of putting what I said. My model of EY’s view is: pre-AGI systems will ramp up in revenue & impact at some rate, perhaps the rate they have ramped up at so far. Then at some point we’ll actually hit AGI (or seed AGI), and then FOOM. That point MIGHT come later, when AI is already a ten-trillion-dollar industry, but it’ll probably come before then. So… I definitely wasn’t interpreting Yudkowsky in the longer-timelines way. His view did imply that maybe nothing super transformative would happen in the run-up to AGI, but not because pre-AGI systems are weak; rather, because there just won’t be enough time for them to transform things before AGI arrives.
Anyhow, I’ll stop trying to speak for him.