Sure, but it seems like everyone died at some point anyway, and some collective copies of them went on?
I don’t think so. I think they seem extremely lonely and sad, and the AIs are the only way for them to get any form of empowerment. Each time they try to inch further, empowering themselves with the AIs, the AI actually gets more powerful while they get only a brief moment of more power, ultimately degrading in mental capacity and needing to empower the AI more and more, like an addict needing an ever greater high. Until there is nothing left for them to do but die and let the AI become the ultimate power.
I don’t particularly care if some non-human semi-sentients manage to be kind of moral/good at coordinating, if it comes at what seems to be the cost of all human life.
Even if, offscreen, all of humanity didn’t die, these people dying, killing themselves, and never realizing what’s actually happening is still insanely horrific and tragic.
Yeah, but I’m contrasting this with (IMO more likely) futures where everyone dies and nothing that’s remotely like a human copy goes on. Even if you conceptualize it as “these people died”, I think there are much worse possibilities for what sort of entity continues into the future (e.g. a non-sentient AI with no human/social/creative/emotional values that just tiles the universe with simple struggles, or “this story happens, but with even less agency and more blatantly dystopian outcomes”).
[Of course, the reason I described this as “optimistic” rather than “less pessimistic than I expect” is that I don’t think the characters died. I think if you slowly augment yourself with AI tools, the pattern of you counts as “you” even as it starts to be instantiated in silicon, so I think this is just a pretty good outcome. I also think the world implies many people thinking about moral/personhood philosophy before taking the final plunge. I don’t think there’s anything even plausibly wrong with the first couple chunks, and I think the second half contains a lot of qualifiers (such as integrating his multiple memories into a central node) that make it pretty unobjectionable.
I realize you don’t believe that, and it seems fine for you to see it as horror. It’s been a while since I discussed the “does a copy of you count as you” question, and I might be up for discussing that if you want to argue about it, but it also seems fine to leave as-is.]
Sure? I agree this is less bad than ‘literally everyone dying and that’s it’, assuming there are humans around, living, still empowered, etc. in the background.
I was saying that overall, as a story, I find it horrifying, especially in contrast with how some seem to see it as utopian.
Nod. I’m just answering your question of why I consider it optimistic.