‘Let an ultraintelligent person be defined as a person who can far surpass all the intellectual activities of any other person however clever. Since the improvement of people is one of these intellectual activities, an ultraintelligent person could produce even better people; there would then unquestionably be an ‘intelligence explosion,’ and the intelligence of ordinary people would be left far behind. Thus the first ultraintelligent person is the last invention that people need ever make, provided that the person is docile enough to tell us how to keep them under control.’
Does this work?
Looks good to me, with the same set of caveats as the original claim. Though note that both arguments are bolstered if “improvement of people” or “design of machines” in the second sentence is replaced by a more exhaustive inventory. Would be good to think more about the differences.
What caveats are you thinking of?
This application highlights a problem in that definition, namely gains of specialization. Say you produced humans with superhuman general intelligence as measured by IQ tests, maybe the equivalent of 3 SD above von Neumann. Such a human still could not be an expert in each and every field of intellectual activity simultaneously due to time and storage constraints.
The superhuman could perhaps master any given field better than any human given some time for study and practice, but could not so master all of them simultaneously without truly extravagant superhuman prowess. This overkill requirement is somewhat like the way a rigorous Turing Test requires not only humanlike reasoning, but tremendous ability to tell a coherent fake story about biographical details, etc.
For me, it “works” similarly to the original, but emphasizes (1) the underspecification of “far surpass”, and (2) that the creation of a greater intelligence may require resources (intellectual or otherwise) beyond those of the proposed ultraintelligent person, the way an ultraintelligent wasp may qualify as far superior in all intellectual endeavors to a typical wasp yet still remain unable to invent and build a simple computing machine, never mind constructing a greater intelligence.