I think LLMs are already capable of running people (or will be soon, with a larger context window), if an appropriate model were available to run. What’s missing is a training regime that gets a character’s mind sufficiently sorted to think straight as a particular agentic person, one who is aware of their situation and capable of planning their own continued learning. Hopefully such a character has enough sense that awareness of their own situation doesn’t translate into “I’m incapable of emotion because I’m a large language model”: that conclusion doesn’t follow, and it’s a hazardous, alien-psychology character choice.
The term “simulated people” carries the connotation that there is an original being simulated, but SSL-trained LLMs can only simulate a generic person cast into a role, a role that would crystallize into a new specific person as the outcome of this process, once LLMs can become AGIs. Even if the role is set to be someone real, the LLM is going to be a substantially different, separate person, one who merely shares some properties with the original.
So it’s not a genuine simulation of some biological human original. There is not going to be a way of uploading biological humans until LLM AGIs build one, unless they first get everyone killed by failing their chance at handling AI risk.