4 seems important to me. I wouldn't expect intelligence to come via that route, but that route does seem to put a fairly credible (e.g. I would bet 4:1 on claims that credible and expect to win in the long term), though high, soft upper bound on how long we can go on with roughly the current rate of scientific progress without achieving AI. I'd say it suggests such a soft upper bound in the 2070s. That said, I wouldn't be at all surprised if science stopped advancing at something like the current rate long before then, accelerating or decelerating a lot even without a singularity.