Further comments, which I’m making in the safe haven of this topic rather than the wilds of the rest of LW:
I’m moderately sympathetic to all the cryonics / singularity stuff that’s often talked about here, but also suspicious. I haven’t come up with a properly argued response (or even read all the very long posts about it!), but LW in general gives me the feeling of twisting things to fit already-chosen conclusions on these topics.
Cryonics: I view it as a long-shot option with a possible big payoff. The part I have my doubts about is the feeling I get that it’s seen as a particularly good long-shot that’s important to focus on.
Singularity stuff: This has all very possibly been discussed at length in a long post I haven’t read, and I’m quite happy to get references. Two areas of this make me uncomfortable:
For me a key problem seems to be the rate at which people can adapt to new technologies. I’m sure I’ve seen this raised either in Marooned in Realtime (http://en.wikipedia.org/wiki/Marooned_in_Realtime) or in very standard commentary on it, so I’m sure this has been addressed somewhere. This seems likely to me to stop acceleration in technology once we reach the stage of significant change within a human lifetime.
Someone still has to do all the thinking. Assuming the singularity happens, and as-yet-undefined entities can solve major problems in short timespans, this will be because they are thinking very fast. They will be operating on a much faster time scale, and to them the apparent rate of progress won’t be much greater. The singularity will only appear to solve all our problems by handwaving from the point of view of the un-accelerated — which around here seems to be viewed as an unpleasant state of existence, to be escaped as soon as the technology is available.