[Carl Shulman] is assuming normality where he shouldn’t, and this is one of the key places where that matters: it is a vision of AGI without ASI.
This is valid exploratory engineering, which assumes some capabilities and considers what can be done with at least those capabilities. There is no implication that this is what will be done, or that capabilities won’t be much greater. We can still conclude that what can be done with merely these capabilities will remain an option given greater capabilities. Forecasting of optionality, not of actuality.