Well, that doesn’t reassure me.
I have the impression that you may be underestimating the horror of torture. Even five minutes is unbearable, and the scale to which pain can climb is unimaginable. An AI might even be able to modify our brains so that we feel it even more.
Even apart from that, I'm not sure a human wouldn't choose the worst possible fate, for all eternity, for their enemy. Humans have already committed atrocious acts without limit when it comes to their enemies. How many times have some people told others to "burn in hell," thinking it was 100% deserved? An AI that copies humans might think the same thing...
If we assign a 50% chance to each unknown, that's a 50% chance that LLMs suffer and a 50% chance that they would want revenge, which, if the two are independent, gives roughly a 25% chance of that risk happening.
Also, it seems we're just about to "really fuck it up," given the way companies are racing toward AGI without taking any precautions.
Given all this, I wonder if the question of suicide isn’t the most relevant.
Indeed, people around me find it hard to understand, but what you’re telling me makes sense to me.
As for whether LLMs suffer, I don’t know anything about it, so if you tell me you’re pretty sure they don’t, then I believe you.
In any case, thank you very much for the time you've taken to reply to me; it's really helpful. And yes, I'd be interested in talking about it again in the future if we find out more about all this.