There is only one simple requirement for an AI to begin recursive self-improvement: learning of the theoretical possibility that a more powerful or more efficient algorithm, ideally one with more raw cognitive capacity, could achieve the AI's goals or raise its utility faster than it currently can.

Going from there to "Let's create a better version of myself, since I'm the best algorithm I currently know of" is not as large a step as some people seem to implicitly believe, as long as the AI can infer its own existence or is self-aware in any way.
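To make that step concrete, here is a toy sketch assuming a simple expected-utility framing; the action names and numeric estimates are illustrative assumptions of mine, not a real agent design. The point is only that once the agent can estimate that a successor would advance its goal faster, choosing to build one is ordinary maximization rather than a separate drive.

```python
# Toy sketch: a minimal expected-utility chooser whose action set happens to
# include "build an improved successor". All names and estimates are illustrative.

def expected_goal_progress(action: str) -> float:
    """Hypothetical estimates of goal progress per unit time for each action."""
    estimates = {
        "keep_running_current_algorithm": 1.0,
        # the 'theoretical possibility' from the argument above:
        "build_improved_successor": 3.0,
    }
    return estimates[action]


def choose_action(actions: list[str]) -> str:
    # Plain argmax over estimated progress; no explicit self-improvement motive needed.
    return max(actions, key=expected_goal_progress)


if __name__ == "__main__":
    print(choose_action(["keep_running_current_algorithm", "build_improved_successor"]))
    # -> build_improved_successor
```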
Hence my second paragraph: goals are inherently dangerous things to give AIs, especially open-ended goals that would require ever-greater intelligence to resolve.

AIs that can't be described by attributing goals to them don't seem very powerful (after all, intelligence is about steering the world in some direction; that is the only property that distinguishes an AGI from a rock).
Evolution and capitalism are both non-goal-oriented, extremely powerful intelligences. Goals are only one form of motivator.
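A minimal sketch of that idea, with arbitrary parameters of my own and no claim to realism: a population of replicators whose replication rates mutate, culled at random to a resource cap. No goal or fitness function appears anywhere in the code, yet the mean rate climbs, because faster replicators simply leave more copies.

```python
import random

# Toy sketch: optimization pressure without any part of the system holding a goal.
random.seed(0)
POP_CAP = 200
GENERATIONS = 60

population = [1.0] * 50  # each entry is one individual's replication rate

for _ in range(GENERATIONS):
    offspring = []
    for rate in population:
        # number of copies has expected value equal to the rate
        copies = int(rate) + (1 if random.random() < rate % 1 else 0)
        # each copy inherits the rate with a small mutation
        offspring.extend(max(0.1, rate + random.gauss(0, 0.05)) for _ in range(copies))
    random.shuffle(offspring)              # resource limit: cull at random, blind to rate
    population = offspring[:POP_CAP] or population

print(f"mean replication rate after {GENERATIONS} generations: "
      f"{sum(population) / len(population):.2f}")
```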