If that’s all we knew about these two technologies, then I would agree that the second is likely to come later, if at all. But it’s not.
Consider two technologies. In one, experts consistently underestimate how long progress will take, and it always ends up being far more difficult and complex than they expect. In the second, experts consistently overestimate how long progress will take, and it often ends up being far easier and simpler than they expected. In that case you would expect the second one (which is AI*) to come first.
I don’t understand either field well enough to predict which will come first, but to do that you’d need either a more detailed understanding, rather than a simple argument that takes only a few factors into account, or one very strong argument, which I don’t think this post makes.
*I know you talked about AGI, but I also take narrow AI into account; I think that’s reasonable given that the field is younger and has seen a huge explosion of progress lately.
In the second, experts consistently overestimate how long progress will take
This doesn’t seem like a fair characterization of AI. People have been predicting that we could build machines that “think like humans” at least since Charles Babbage, and those predictions have been pretty consistently overoptimistic.
but to do that you’d need either a more detailed understanding
My point is precisely that we do have a detailed understanding of what it takes to build a fusion reactor, and it is still (at least) 15 years away.