I’m setting the bar at having a competent AGI that channels their will; for LLM AGIs, the model is the main ingredient in that. Possibly also a character-selecting prompt, which is one reason not to talk only about models, though developing multiple competent characters within a single model without their consent might be inhumane treatment.
Once it’s possible to make any other sort of LLM AGI, it will probably be instantly feasible to turn characters from novels into people, at the cost of running an AGI-bootstrapping process and with the moral implications of bringing another life into the world. But this person is not your child, or a child at all. Instantiating children as LLM AGIs probably won’t work initially, at least not in a way where they proceed to grow up.
Whose will?