You are right about the singularity, but the underlying trend extrapolation is that of technical progress and, specifically, of software getting smarter.
Nowadays people are used to rapid technical progress and often consider it, um, inevitable. A look at history should disabuse one of that notion, though.
Yudkowsky explicitly doesn’t believe in rapid technical progress. He has said that he believes in the Great Stagnation (a slowdown in science/tech/economic progress), which is possibly a good thing, since it may retard the creation of AGI, giving people a better shot at working on friendliness first.
I don’t believe that prediction is based on trend-extrapolation. Nothing like that has ever happened, so there’s no trend to draw from.
Links? What is “rapid”? Did he look at his phone recently?
The Great Stagnation is a phenomenon on the time scale of decades. How about the time scale of centuries?
Here is one: https://www.facebook.com/yudkowsky/posts/10152586485749228.
Yes, he believes in the Great Stagnation. That does not imply he doesn’t believe in rapid technological progress. Again, what is “rapid”?