The first three bullet points in the grandparent are the most important ideas associated with the Singularity that everyone needs to know about. They accord with the Singularity Institute’s predictions, with the core claims of Yudkowsky’s three Singularity “schools”, and Robin Hanson’s ems scenario.
When I read Carrier’s first sentence (“I agree the Singularity stuff is often muddled nonsense.”), I assumed he would be skeptical of those three points in one of the usual ways. But instead he actually affirms two of them:
machines will outthink humans (and be designing better versions of themselves than we ever could) within fifty to a hundred years
(He doesn’t address my second bullet point.) It’s not clear what the “Singularity” means to him; but from his criticisms, it seems to have something to do with IQ, and whether machine intelligences will be AIs or uploads or whatever.
My point is that if and when the Singularity Institute succeeds in convincing everyone of the first three bullet points in the grandparent, it will still be fashionable to dismiss the Singularity hypothesis, because everyone has their own strawman version of what the Singularity is.
I heard David Pearce speak recently, and he mentioned in passing that he is “not a starry-eyed Singularitarian”, and by this he seemed to mean that he thought brain uploading was infeasible. But elsewhere in the talk he spoke casually of utilitronium shockwaves and jupiter brains and pleasure plasma.
I don’t even know what the Singularity is anymore. In fact, I never did. I suspect that the disclaimer “I am not a Singularitarian” means “I am not Eliezer Yudkowsky circa 1999, although everything he says nowadays is quite reasonable.”
Ah, I get it now! I think Carrier’s argument was that computing power will not exceed what Moore’s Law already predicts, though I’m not exactly sure why, or how that disproves the Singularity.
and by this he seemed to mean that he thought brain uploading was infeasible. But elsewhere in the talk he spoke casually of utilitronium shockwaves and jupiter brains and pleasure plasma
Wait, what? Isn’t brain uploading obviously easier than those other things?
I can’t speak for him, but he possibly meant that brain uploading won’t be feasible for a while, and the posthuman era will be ushered in with implants, gene therapy, drugs, etc.
I don’t get it.