I believe that by the time an AI has fully completed the transition to hard superintelligence
Nate, what is meant by “hard” superintelligence, and what would precede it? A “giant kludgey mess” that is nonetheless superintelligent? If you’ve previously written about this transition, I’d like to read more.
Maybe Nate has something in mind like Bostrom’s “strong superintelligence”, defined in Superintelligence as “a level of intelligence vastly greater than contemporary humanity’s combined intellectual wherewithal”?
(Whereas Bostrom defines “superintelligence” as “any intellect that greatly exceeds the cognitive performance of humans in virtually all domains of interest”, where “exceeding the performance of humans” means outperforming the best individual human, not necessarily outperforming all of humanity working together.)