Good. But this future sounds less fun than Reedspacer’s lower bound, so I guess it’s a dystopia?
All of the “action” (if I can use that word in a story where the most action-packed scene is reading a passage from Moby Dick) occurs outside the society, and is beyond the comprehension of the main character. You get very little idea of what this future is like to the bigger creatures.
Not necessarily. This is a story of rich emotion, kept alive in this world. I would be suspicious of a utopia that doesn’t keep sad stories around. Then again, while Asshole might have kept I miserable for a while, he ultimately joined the collective, and they might really live in a utopia. Even if they (it) are “rediscovering” love, they might have a far richer life than I could imagine (me or him).
I’ve only read of one utopia that is not worse than that by design. http://cityofreality.com/
Nice!
n-dimensional experience sounds better than sex to me, as does a properly executed collectivization. Couldn’t you, at the minimum, just simulate a perfectly-executed 8000-participant group sex act?
However, competitive market for computational space to live sounds apocalyptically bad.
Agree about the sex. (Or, at least, I know I ought to.)
Only if you’re a static individual for whom “death” is a meaningful concept.
Does some non-competitive way of allocating resources really sound better? What you’re saying sounds analogous to 18th-century protectionism. If you create an open market, some jobs will be lost, but everybody will be better off. Likewise, if you allocate resources to processes that are useful, it will be a better world than if someone gets arbitrary amounts of resources because their great-grandparents owned them.
Also remember that “I” has a physical body that is vast, with truly massive energy requirements by the standard of the day. Sort of like if you kept Switzerland as your summer home.
In the 18th century, it was conscious beings competing against conscious beings. There is no reason that needs to continue to hold. Once all the problems that require general intelligence have been solved, only something on the level of expert systems will be needed; anything more will be wasteful. Maybe the most efficient things will be minds, but they will be cold, dark minds: they could easily lack love, curiosity, boredom, play, or other fundamental human values. They will have no surplus, and so no leisure.
Shades of Blindsight there.
That does raise the question of “efficient for what?”, though. If we discount the idea that nonsapient agents are likely to end up as better fully general optimizers than sapient ones (and thus destined to inherit the universe whether we like it or not), we’re left with a mess of sapient agents that presumably want to further their own interests. They could potentially outcompete themselves in limited domains by introducing subsapient category optimizers, but in most cases I can’t think of a reason why they’d want to.
Is Blindsight some kind of cool science fiction thing that I should know about?
Reproducing and taking resources from other agents are enough to cause the apocalypse. If sapients band together to notice and eliminate resource-takers, they can prevent this sort of apocalypse. To do so they will essentially create a singleton, finally ending evolution / banishing the blind idiot God.
Ah, sorry. Novel by Peter Watts, available here. Touches on some of the issues you introduced.
These are all good points.
For some definitions of utility, sure. Not by I’s measure.
Though as you point out, the measure from the story makes a lot more sense to everyone else in the society and perhaps even to the post-unification version(s) of I.
The title character’s resource constraints seem to be more a matter of extreme asceticism motivated by grief, rather than actual poverty.
My evidence is that:
“Market forces set the cost of processing cycles to be equal to the expected financial gain from their application. Thus that is not a winning proposition. You know that, I.”

“We have spent more processing cycles considering your situation than you can possibly afford in the time remaining. We have found only one solution.”
Perhaps it is a matter of interpretation.
I interpreted that as A* pointing out I’s irrationality in choosing counterproductive asceticism. Such a closed-off life would be unsustainable, for the same sorts of reasons that a country living on rented land but completely isolated from the international community is unsustainable.
If there exists an agent who converts processing power into money using the most efficient possible means and uses any surplus to reproduce, its children will inevitably control all the processing power. That’s the apocalypse. If you interpret it a different way, there is no apocalypse.
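The compounding argument can be made concrete with a toy model (my own sketch; the names and the 10% surplus rate are illustrative assumptions, not anything from the story). Income is proportional to processing power held; one lineage reinvests all surplus into acquiring more cycles, while an individual consumes its surplus.

```python
def replicator_share(rounds: int = 50, growth: float = 0.10) -> float:
    """Fraction of all processing power held by the reinvesting lineage.

    growth: surplus per round, as a fraction of current holdings.
    """
    replicator = 1.0  # lineage that reinvests every surplus cycle
    individual = 1.0  # agent that spends its surplus on being an individual
    for _ in range(rounds):
        # Compounding: this round's surplus buys more cycles next round.
        replicator *= 1.0 + growth
    return replicator / (replicator + individual)
```

Even at a modest 10% surplus per round, the reinvesting lineage ends up holding over 99% of all cycles within 50 rounds, which is the sense in which its children “inevitably control all the processing power.”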
“Money,” though, is probably defined by what those who own the processors want, which includes space for them, at least. But in that case, an individual could easily survive by owning enough processors to generate an income stream large enough to cover the waste of remaining an individual.
It’s worse for I. It’s unknown how A**hole feels. Given that they didn’t split, they probably liked it.