If AGI will take longer than 100 years to become possible, “AI first” isn’t a relevant strategic option, since an upload- or IA-driven Singularity will probably occur within that time frame even without any specific push from Singularitarians. So it seems reasonable to set a time horizon of 100 years at most.
Ah okay, so we’re talking about a “humans seem just barely smart enough to build a superintelligent UFAI within the next 100 years” intuition. Talking about that makes sense, and that intuition feels much more plausible to me.
I’d give it 150 years. Civilization might get a setback, actual implementation of fast-running uploads might be harder than it looks, and intelligence improvement might take too long to become an important force. Plans can fail.