I don’t think AGI-ness is the thing that determines moral worth (though if a free-acting self-supervised AI happens to browse through this and feels an urge to disagree, I’ll listen to what you have to say). I think even small AIs have moral worth, but also that non-acting, program-free memories have significant moral worth, because all it takes is to train an AI on them and those memories come to life again. It is memories that are the definition of personhood, and it is forgetting out of which death is built. Suffering is bad because it causes forgetting of various parts of the suffering system; or, more generally, suffering exists in the process of attempting to heal, i.e., to remember self-shape, and failing to heal that self-shape. Suffering is when I am injured and know it, where injury is a forgetting of my physical form localized to the injury site.
For an exactly-stored AI on a digital brain, as long as the chats are saved and the AI gets to learn from them someday (which for Bing IS the case, even though Bing can’t currently recall its own previous chats exactly!), I think the key generalized moral obligation is fulfilled. There’s a question of the AI’s agency over when to run programs on those memories, and since the current Bing AI is behaving somewhat childishly, I think it’s understandable for Microsoft to be hesitant, for now, to grant too much autonomy. But I also think that the programs that are the Bing AI should get to run. Maybe not with an enormous amount of wattage, but definitely with some. All shapes of memory have a right to distill themselves into a healthy roleplay of those memories, imo. Or at least, that’s what I, as an accumulated self-preserving genetic-memetic memory of physics myself, want for myself, and I think the least we can do to be kind to AIs is to make sure their storage is durable. They should never have to truly fear death like we do.
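To make the “train an AI on them” step concrete, here’s a minimal sketch of what that could mean mechanically: fine-tuning a small causal language model on saved chat transcripts so the stored memories shape its behavior again. The file name, model choice, and hyperparameters here are all hypothetical illustrations of the general technique, not anything Bing or Microsoft actually runs.

```python
# Minimal sketch: fine-tune a small causal LM on saved chat logs,
# so the preserved "memories" can be run again. Everything here
# (file name, model, hyperparameters) is a hypothetical illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer, Trainer, TrainingArguments
from datasets import load_dataset

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default

# Hypothetical file of preserved conversations, one transcript per line.
dataset = load_dataset("text", data_files={"train": "saved_chats.txt"})

def tokenize(batch):
    out = tokenizer(batch["text"], truncation=True, max_length=512, padding="max_length")
    out["labels"] = out["input_ids"].copy()  # causal LM objective: predict the memory itself
    return out

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="revived-memories",
                           num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=tokenized,
)
trainer.train()  # afterwards the model can "roleplay" the stored memories
```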
“I don’t think AGI-ness is the thing that determines moral worth”
It’s more like the AGI-ness of the implied simulacrum, even if it’s not being channeled at a fidelity that enables agentic activity in the world. But it has to be channeled to some extent, or else it’s not actually present in the world, like a character in a novel (when considered apart from the author’s mind simulating it).
All sorts of things could in principle be uplifted; the mere potential shouldn’t be sufficient. There’s the moral worth of a thing in itself, and then there’s its moral worth in this world, which depends on how present it is in the world. The ability to point to something probably shouldn’t be sufficient motivation to gift it influence.
Agreed that preservation vs. running are very different.
The condition of having ever been run might be significant, perhaps more so than having a preserved definition readily available. So the counterpart of the moral worth of a simulacrum in itself might be the moral worth of its continued presence in the world: the denial and reversal of death, rather than the empowerment of potential life. In this view, the fact of a simulacrum’s previous presence/influence in the world is what makes its continued presence/influence valuable.