EY warns against extrapolating current trends into the future. Seriously?
Why does that surprise you? None of EY’s positions seem to be dependent on trend-extrapolation.
Trend extrapolation is more reasonable than invoking something that hasn’t happened at all yet, and then claiming, “When this happens, it will become an unstoppable trend.”
It would be more reasonable to use trend extrapolation if this were a field where a trend would necessarily be discernible. Yudkowsky argues there could be sharp discontinuities. Personally, I don’t feel qualified to have a strong opinion, and I would not be able to discern a trend even if one existed.
Other than a technological singularity with an artificial-intelligence explosion to a god-like level?
I don’t believe that prediction is based on trend-extrapolation. Nothing like that has ever happened, so there’s no trend to draw from.
You are right about the singularity, but the underlying trend extrapolation is that of technical progress and, specifically, of software getting smarter.
Nowadays people have gotten used to rapid technical progress and often consider it, um, inevitable. A look at history should disabuse one of that notion, though.
Yudkowsky explicitly doesn’t believe in rapid technical progress. He has said that he believes in the Great Stagnation (the slowdown in scientific, technological, and economic progress), which is possibly a good thing, since it may retard the creation of AGI and give people a better shot at working on friendliness first.
Links? What is “rapid”? Has he looked at his phone recently?
The Great Stagnation is a phenomenon on the time scale of decades. How about the time scale of centuries?
Here is one: https://www.facebook.com/yudkowsky/posts/10152586485749228
Yes, he believes in the Great Stagnation. That does not imply he doesn’t believe in rapid technological progress. Again, what is “rapid”?