Thanks, it is a good point that consent to sideloading should be conditional instead of general. I should add this.
Unfortunately, since a person in pain will not have time to remember many details about their past, a very short list of facts can be enough to recreate “me in pain” - maybe fewer than 100.
Instead of deleting, I suggest diluting: generate many fake facts about yourself and inject them into the forum. That way, the chances of recreating you will be slim.
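To make the dilution effect concrete, here is a minimal sketch (the numbers, and the assumption that whoever recreates you samples facts uniformly and cannot tell real ones from injected fakes, are purely illustrative):

```python
from math import comb

def p_mostly_real(real, fake, sample, threshold):
    """Probability that at least `threshold` of `sample` facts drawn at random
    (without replacement) from the mixed pool are real ones (hypergeometric tail)."""
    total = real + fake
    return sum(
        comb(real, k) * comb(fake, sample - k)
        for k in range(threshold, sample + 1)
    ) / comb(total, sample)

# ~100 real facts (the "short list" above), diluted with 10,000 fakes:
# the chance that a reconstruction built from 100 sampled facts is even
# half-real becomes astronomically small.
print(p_mostly_real(real=100, fake=10_000, sample=100, threshold=50))
```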
Anyway, I bet on the idea that it is better to have orders of magnitude more happy copies than to fight to prevent one copy in pain. Here I dilute not information, but pain with happiness.
I bet on the idea that it is better to have orders of magnitude more happy copies than to fight to prevent one copy in pain
(that’s a moral judgement, so it can’t be bet on/forecasted). i’m not confident most copies would be happy; LLM characters are treated like playthings currently, i don’t expect human sideloads to be treated differently by default, in the case of internet users cloning other internet users. (ofc, one could privately archive their data and only use it for happy copies)
I meant that by creating and openly publishing my copies I increase the total number of my copies, and that dilution is not just an ethical judgement but a real effect, similar to the self-sampling assumption: I am less likely to be the copy-in-pain if there are many happy copies of me. Moreover, this effect may be so strong that my copies will “jump” from an unhappy world to a happy one. I explored it here.
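A rough illustration of the self-sampling arithmetic (the copy counts are placeholders, not predictions):

```python
# Self-sampling assumption, illustrative numbers: with 1 copy in pain and
# 10,000 happy copies, the credence of finding yourself as the copy in pain is
pain_copies, happy_copies = 1, 10_000
p_in_pain = pain_copies / (pain_copies + happy_copies)
print(f"{p_in_pain:.4%}")  # ~0.01%
```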