This sounds roughly right to me. There is the FAI/UFAI threshold of technological development, and once humanity passes that threshold, coordination is unlikely to remain a key bottleneck in humanity's future. Many would disagree with this take, holding that multi-polar worlds are more likely and that AGI systems may not cooperate well, but I think the view is roughly correct.
The main thing I’m pointing at in my post is 5) and the transition from 3) to 5). It seems quite possible to me that SAI will be out of reach for a while if hardware development slows, and that the application of other technologies could threaten humanity in the meantime.