How do you plan to fix the bugs in its bug-fixing ability, before the bug-fixing ability is applied to fixing bugs in the “don’t kill everyone” routine? ;-)
More to the point, how do you know that you and the machine have the same definition of “bug”? That seems to me like the fundamental danger of self-improving AGI: if you don’t agree with it on what counts as a “bug”, then you’re screwed.
(Relevant SF example: a short story in which the AI ship—also the story’s narrator—explains how she corrected her creator’s all-too-human error: he said their goal was to reach the stars, and yet for some reason, he set their course to land on a planet. Silly human!)
What about a “controlled ascent”?
How would that be the default case, if you’re explicitly taking precautions?
It seems as though you don’t have any references for the supposed “hubris verging on sheer insanity”. Maybe people didn’t think that in the first place.
Computers regularly detect and fix bugs today. Eclipse, for example, flags many common defects as you type and offers one-click quick fixes for them.
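To make that concrete, here is a minimal illustration (the class name and message are mine, invented for this sketch, not taken from any particular project) of the shallow, well-defined kind of defect such tooling handles:

```java
import java.util.List; // Unused import: Eclipse flags this with a warning
                       // and offers a one-click quick fix to remove it.

public class QuickFixDemo {
    public static void main(String[] args) {
        int unused = 42; // Unused local variable: also flagged by default,
                         // with a quick fix to delete the declaration.
        System.out.println("IDE quick fixes handle shallow, well-defined bugs.");
    }
}
```

Note that these are exactly the cases where human and machine trivially agree on what counts as a "bug"; the worry raised above starts where the spec itself is in dispute.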
I never claimed that "controlled ascent" was "the default case". In fact, I am criticising "the default case" here as weasel wording.
Controlled ascent isn’t the default case, but it certainly should be what provably friendly AI is weighed against.