Maybe the result is still some kind of sentient posthuman society, engaged in creativity and various positive experiences that I’d endorse, which then goes on to colonize the universe. It’s sad that humans and non-committed transhumans got outcompeted, but at least there’s still some kind of light in the universe.
But still, this doesn’t seem like an outcome I’m enthusiastic about.
Your descendants a few million years from now are going to be “posthuman”, even if AI and genetic engineering never happen. What’s wrong with the future coming a little sooner?
The argument is never about how soon the future will come, always about how good the future will be. There is nothing “wrong” with any given outcome, but if we can do better, then it’s worth dedicating thought to that.
Yeah. It’s also about “how much do you want to bet on a given future?”
The Accelerando scenario is extremely optimistic, in that the minds powerful enough to control the solar system end up caring about human value at all. Imagine this scenario going slightly awry, Disneyland with No Children style, in the direction of Scott’s ascended economy, where the posthuman actors end up totally non-conscious.