How is this optimistic?
Well, in this world:
1. AI didn’t just kill everyone 5% of the way through the story
2. IMO, the characters in this story basically get the opportunity to reflect on what is good for them before taking each additional step. (They maybe feel some pressure to Keep Up With The Joneses re: AI-assisted thinking, but that pressure isn’t super strong. Like, the character’s boss isn’t strongly implying that if they don’t take these upgrades they’ll lose their job.)
3. Even if you think the way the characters make their choices here is more dystopian, and that they were dragged along a current of change that was maybe bad and maybe even literally destroyed their minds, the beings that exist at the end seem to value many of the things I value, and seem to be pretty good at coordinating at scale about how to make good choices both individually and collectively. (Like, if you think they basically foolishly died, the thing that replaces them could have been much worse.) (I don’t think they foolishly died, but the argument about that probably doesn’t fit in this margin.)
However bad you think this outcome is societally, I think it could have been much worse: a rushed panic to adopt each new technology immediately, the earth getting converted to computronium quickly (even if decided via recent posthumans), and people who don’t adopt AI tech being either killed or uploaded against their will. (You might think this happened in this story offscreen, which is maybe reasonable. I think it’s implied pretty strongly that it didn’t.)
Sure, but it seems like everyone died at some point anyway, and some collective copies of them went on?
I don’t think so. I think they seem to be extremely lonely and sad, and the AIs are the only way for them to get any form of empowerment. Each time they try to inch further with empowering themselves via the AIs, the AI actually gets more powerful while they get only a brief moment of more power, ultimately degrading in mental capacity and needing to empower the AI more and more, like an addict needing an ever greater high. Until there is nothing left for them to do but die and let the AI become the ultimate power.
I don’t particularly care if some non-human semi-sentients manage to be kind of moral and good at coordinating, if it came at what seems to be the cost of all human life.
Even if all of humanity didn’t die offscreen, these people dying, killing themselves, and never realizing what’s actually happening is still insanely horrific and tragic.
Yeah, but I’m contrasting this with (IMO more likely) futures where everyone dies and nothing that’s remotely like a human copy goes on. Even if you conceptualize it as “these people died”, I think there are much worse possibilities for what sort of entity continues into the future (e.g. a non-sentient AI with no human/social/creative/emotional values that just tiles the universe with simple struggles, or “this story happens, but with even less agency and more blatantly dystopian outcomes”).
[Of course, the reason I described this as “optimistic” instead of “less pessimistic than I expect” is that I don’t think the characters died. I think if you slowly augment yourself with AI tools, the pattern of you counts as “you” even as it starts to be instantiated in silicon, so I think this is just a pretty good outcome. I also think the world implies many people thinking about moral/personhood philosophy before taking the final plunge. I don’t think there’s anything even plausibly wrong with the first couple chunks, and I think the second half contains a lot of qualifiers (such as integrating his multiple memories into a central node) that make it pretty unobjectionable.
I realize you don’t believe that, and it seems fine for you to see it as horror. It’s been a while since I discussed “does a copy of you count as you”, and I might be up for discussing that if you want to argue about it, but it also seems fine to leave as-is.]
Sure? I agree this is less bad than ‘literally everyone dying and that’s it’, assuming there are humans around, living, still empowered, etc. in the background.
I was saying that overall, as a story, I find it horrifying, especially contrasting with how some seem to see it as utopian.
Nod. I’m just answering your question of why I consider it optimistic.
I would be curious whether you consider The Gentle Seduction to be optimistic. I think it has fewer elements that you mentioned finding dystopian in another comment, but I find the two trajectories similarly good.