That’s a spectacular story.
What are the compromises you needed to make with what you actually think about AI and the singularity?
Thanks!
There are some unlikely technologies used: transports rolling down a road on wheels, headlights, tape as a backup device. Some people at Clarion objected to the use of tape for backup, but someone else pointed out that tape is a general-purpose way of giving a 2D storage medium 3D packing efficiency; so it's likely to always be more space-efficient than any purely 3D random-access memory medium.
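The packing argument can be made concrete with back-of-envelope arithmetic. All numbers below are illustrative assumptions, not real product specs; only the ratio matters:

```python
# Why wound tape beats a flat 2D medium on volumetric density:
# winding stacks the 2D recording surface into layers only as far apart
# as the tape is thick. Numbers here are illustrative assumptions.

areal_density_bits_per_mm2 = 2e7   # assumed areal density of the tape surface
tape_thickness_mm = 0.006          # assumed tape thickness (~6 microns)

# Wound tape: volumetric density = areal density / per-layer thickness.
volumetric_density = areal_density_bits_per_mm2 / tape_thickness_mm  # bits/mm^3

# The same surface left flat in a 1 mm-thick slab stores only its areal
# density per mm^3 of occupied volume.
flat_density = areal_density_bits_per_mm2 / 1.0  # bits/mm^3

# Winding multiplies effective density by (slab thickness / tape thickness).
print(volumetric_density / flat_density)
```

Whatever areal density the recording surface achieves, winding divides the occupied volume by the tape thickness, so the advantage holds as long as tape can be made thinner than any practical 3D cell pitch.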
A VR overlay on the physical world is a very indirect and inefficient way of conveying information to beings with writable random-access memories. But it’s not absurd; it would at least avoid problems with standards and backwards-compatibility.
The expression of time in years and days is unlikely. There’s no compression of subjective time indicated. Something with the computational complexity of a human would be more likely to measure time in microseconds. The timescale (collectives sometime around 2229) seems more likely to be too long than too short; but a good estimate for something with an asymmetric distribution should have that property. That date may have been a deliberate compromise with reader expectations. In the story, there seems to have been a gradual transition rather than a Singularity, as people could join collectives. I’m okay with that.
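The point about asymmetric distributions can be illustrated with a toy simulation. Assuming (purely for illustration) that the arrival time is right-skewed, say lognormal, a mean-based point estimate lands later than the realized value more than half the time, i.e. it is more likely to be "too long" than "too short":

```python
import random

# Toy illustration: for a right-skewed (here lognormal) arrival-time
# distribution, the mean exceeds the median, so a mean-based estimate
# overshoots the realized value in well over half of the draws.
# Distribution parameters are arbitrary.
random.seed(0)
samples = [random.lognormvariate(0.0, 1.0) for _ in range(100_000)]

mean_estimate = sum(samples) / len(samples)
frac_too_long = sum(s < mean_estimate for s in samples) / len(samples)

print(frac_too_long)  # noticeably above 0.5 for a right-skewed distribution
```

So an estimate that turns out to be too long is exactly what one should expect from a well-calibrated forecast of a skewed quantity.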
The premise that “I” can be interesting to the collectives, and yet there is only one such person, is probably not plausible. If individuals were a little more interesting, there would be more of them; a little less interesting, and there would be zero of them. It’s extremely unlikely that this would balance out so precisely for so long.
The idea at the very end, that the collectives don’t experience pride and love, surprised me the most, and I don’t remember why I wrote it that way. At least it doesn’t say they are unacquainted with them (I assume they have histories).
These are broad terms, especially "love". I believe they are meant here in a narrow sense (hence, for example, they are called "emotions", while love, taken in its entirety, is not an emotion).
ETA: though, the core meaning of "love" as "the decision to value a person as a token" (as opposed to valuing their content) might be what's meant. I guess what I'm trying to say is that the collectives have something they would best call "love", but it would still be essentially different from the "original" love that "I" has.