There seems to be the implicit assumption that superhuman AI will be some sort of sudden absolute thing.
why?
If I were to guess, I’d say the most likely course is gradual improvement over quite some time, more like the development of airplanes than the development of the atomic bomb.
If you handed modern bombers and the tech to support them to one side in the First World War, you can be sure they’d have won pretty quickly. And there was investment in flight, and it was useful. But early planes were slow, fault-prone, terrible as weapons platforms etc.
We might very well see AI develop slowly, with roadblocks every few years or decades which halt or slow advancement for a while until some solution is found.
I guess it’s down to whether you assume the difficulty of increasing intelligence grows linearly or exponentially.
If each additional IQ point (for want of a better measure) is harder to add than the last, then even with a cycle of self-improvement you’re not automatically going to get a god.
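To make that concrete, here’s a toy simulation of the two assumptions. Everything in it is made up for illustration: capability is a stand-in for “IQ”, effort scales with current capability (that’s the self-improvement loop), and the only thing that differs between the two runs is whether the cost of the next capability point stays flat or compounds.

```python
# Toy model of recursive self-improvement under two difficulty regimes.
# All numbers are arbitrary; only the qualitative shapes matter.

def run(cost_growth: float, steps: int = 50) -> float:
    capability = 100.0  # stand-in for "IQ"
    cost = 1.0          # effort required for the next capability point
    for _ in range(steps):
        effort = capability * 0.01      # smarter agent -> more effort per step
        capability += effort / cost     # convert effort into capability
        cost *= cost_growth             # difficulty of the *next* point
    return capability

linear = run(1.0)        # cost never increases: gains compound
exponential = run(1.2)   # each point costs 20% more: gains stall

print(f"flat difficulty:      {linear:.1f}")
print(f"compounding difficulty: {exponential:.1f}")
```

With flat difficulty the feedback loop runs away (capability grows ~1% per step, compounding); with compounding difficulty the same loop plateaus only a few points above where it started, because the geometric cost growth eats the gains. Same self-improvement mechanism, wildly different outcomes, purely from the difficulty assumption.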
We might even see intelligence augmentation keeping pace with AI development for quite some time.