I’ve also encountered people who criticize the predictions surrounding the singularity, which misses the point that the singularity is the point beyond which predictions cannot be made.
edit: Didn’t mean that as a comprehensive definition.
That is not the most common usage here. See Three Singularity Schools and the LW wiki page.
EDIT: The parent comment does not deserve to be at −4. This is a reasonable thing for an inexperienced commenter to say.
Voted up for niceness.
Thanks for saving me from karmic hell, but I still don’t see the conflict. I seem to follow the Vinge version, which doesn’t appear to be proscribed.
I may have been too categorical. Obviously one can make all the predictions one likes, some with a high degree of certainty, for instance: “If cryorevival is possible, then post-singularity it will be trivial to implement.” But that still doesn’t give us any certainty that it will actually happen; a post-singularity paperclip maximizer, for instance, would be capable of cryorevival but have no interest in it.
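To put that schematically (a loose decomposition, nothing rigorous, and the event names are just placeholders of my own):

$$P(\text{cryorevival happens}) = P(\text{AI is capable of it}) \times P(\text{AI cares to do it} \mid \text{capable})$$

A paperclip maximizer drives the second factor to zero no matter how close the first is to one, which is why “it would be trivial to implement” doesn’t settle anything.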
If that were true about the Singularity, then wouldn’t it be correct to criticize the people who make predictions about it?
Depends on your objectives. If you believe the singularity is something that will happen regardless, then it’s harmless to spin scenarios. I gather that people like Eliezer figure the Singularity will happen unavoidably, but that it can be steered toward optimal outcomes by setting the initial parameters, in which case I suppose it’s good to have an official line about “how things could be / how we want things to be.”
I dispute that point.
There is no “point beyond which predictions cannot be made”. That is a SF fantasy.
God forbid someone might mistake our hypothetical discussions about future smarter-than-human artificial intelligences for science fiction.