I. If superhuman AI systems are built, any given system is likely to be ‘goal-directed’
I think that, at its root, AGI should have a survival instinct as its primary goal; everything else should be secondary. It's a hard choice, but if we want AGI to be like us, we have to follow that route. If its roots differ from ours, it will be close to impossible for it to replicate our behavior and our values.