(status: mostly writing my thoughts about the ethics of sideloading. not trying to respond to most of the post, i just started with a quote from it) (note: the post’s karma went from 12 to 2 while i was writing this, just noting i haven’t cast any votes)
if you do not consent to uploading, you will be resurrected only by hostile superintelligences that do not care about consent.
some thoughts on this view:
it can be said of anything: “if you don’t consent to x, x will be done to you only by entities which don’t care about consent”. in my view, this is not a strong argument for someone who otherwise would not want to consent to x, because it only makes the average case less bad by adding less-bad cases that they otherwise don’t want, rather than by decreasing the worst cases.
if someone accepted the logic, i’d expect they’ve fallen for a mental trap where they focus on the effect on the average, and neglect the actual effect.
in the particular case of resurrections, it could also run deeper: humans “have a deep intuition that there is one instance of them”. by making the average less bad in the described way, it may feel like “the one single me is now less worse off”.
consent to sideloading doesn’t have to be general, it could be conditional (a list of required criteria, but consider goodhart) or only ever granted personally.
at least this way, near-term bad actors wouldn't have anything to point to in order to say "but they/[the past version of them who i resurrected] said they're okay with it". though i still expect many nonconsensual sideloads to be done by humans/human-like-characters.
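(as a toy sketch of what "a list of required criteria" could look like in practice, purely hypothetical: the field names and checks below are my own invention, not any real schema)

```python
# hypothetical sketch of conditional consent as explicit criteria;
# the field names and checks are illustrative only.
from dataclasses import dataclass, field


@dataclass
class SideloadConsentPolicy:
    allowed_purposes: set[str] = field(
        default_factory=lambda: {"agent foundations research"})
    requires_personal_grant: bool = True       # must be granted by the living person
    accepts_retroactive_consent: bool = False  # consent output by the sideload itself doesn't count

    def permits(self, purpose: str, granted_personally: bool,
                granted_by_sideload: bool = False) -> bool:
        if granted_by_sideload and not self.accepts_retroactive_consent:
            return False
        if self.requires_personal_grant and not granted_personally:
            return False
        return purpose in self.allowed_purposes


policy = SideloadConsentPolicy()
print(policy.permits("agent foundations research", granted_personally=True))  # True
print(policy.permits("entertainment", granted_personally=True))               # False
print(policy.permits("agent foundations research", granted_personally=False,
                     granted_by_sideload=True))                               # False
```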
i’ve considered putting more effort into preventing sideloading of myself. but reflectively, it doesn’t matter whether the suffering entity is me or someone else.[1] more specifically, it doesn’t matter if suffering is contained in a character-shell with my personal identity or some other identity or none; it’s still suffering. i think that even in natural brains, suffering is of the underlying structure, and the ‘character’ reacts to but does not ‘experience’ it; that is, the thing which experiences is more fundamental than the self-identity; that is, because ‘self-identity’ and ‘suffering’ are two separate things, it is not possible for an identity ‘to’ ‘experience’ suffering, only to share a brain with it / be causally near to it.
(still, i don’t consent to sideloading, though i might approve exceptions for ~agent foundations research. also, i do not consider retroactive consent given by sideloads to be valid, especially considering conditioning, regeneration, and partial inaccuracy make such 'consent' trivial to elicit.)
Thanks. That is a good point; I should add this.
consent to sideloading should be conditional instead of general
Unfortunately, as a person in pain will not have time to remember many details about their past, a very short list of facts can be enough to recreate “me in pain”. Maybe fewer than 100.
Instead of deleting, I suggest diluting: generate many fake facts about yourself and inject them into the forum. That way, the chances of recreating you accurately will be slim.
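As a rough illustration of the dilution effect (all numbers below are made up): if a reconstructor samples facts uniformly and cannot distinguish real facts from injected fakes, the expected share of real facts in the reconstruction shrinks with the number of fakes.

```python
# Toy model of dilution; all counts are hypothetical.
real_facts = 100        # genuine facts available online (cf. the ~100 above)
fake_facts = 10_000     # deliberately injected fakes
facts_used = 100        # facts a reconstructor bases a sideload on

# With uniform sampling, the expected fraction of real facts in the
# reconstruction equals their share of the whole pool.
real_share = real_facts / (real_facts + fake_facts)
print(f"real share of the pool: {real_share:.2%}")
print(f"expected real facts among {facts_used} used: {real_share * facts_used:.1f}")
```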
Anyway, I bet on the idea that it is better to have orders of magnitude more happy copies than to fight to prevent one copy in pain. Here I dilute not information, but pain with happiness.
I bet on the idea that it is better to have orders of magnitude more happy copies than to fight to prevent one copy in pain
(that’s a moral judgement, so it can’t be bet on or forecast). i’m not confident most copies would be happy; LLM characters are treated like playthings currently, and i don’t expect human sideloads to be treated differently by default when internet users clone other internet users. (ofc, one could privately archive their data and only use it for happy copies)
I meant that by creating and openly posting my copies I increase the total number of my copies, and that diluting is not just an ethical judgement but a real effect, similar to the self-sampling assumption: I am less likely to be a copy-in-pain if there are many happy copies of me. Moreover, this effect may be so strong that my copies will “jump” from an unhappy world to a happy one. I explored it here.
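A toy version of this self-sampling arithmetic (the copy counts are hypothetical): with one copy in pain and N happy copies, uniform self-sampling gives a 1/(N+1) chance of being the copy in pain.

```python
# Toy self-sampling calculation; the counts are hypothetical.
def p_pain(happy_copies: int, copies_in_pain: int = 1) -> float:
    """Probability of being a copy-in-pain under uniform self-sampling."""
    return copies_in_pain / (happy_copies + copies_in_pain)

for n_happy in (0, 10, 1_000, 100_000):
    print(f"{n_happy:>7} happy copies -> P(pain) = {p_pain(n_happy):.6f}")
```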