When you say “take over”, what specifically do you mean? In the context of a GPT descendant, would “taking over” imply it’s doing something beyond providing a text output for a given input? For example, going out of its way to minimize its cross-entropy loss by acquiring additional GPUs, etc.?
“AI developers should not develop and deploy systems with a significant risk of killing everyone.”
If you were looking at GPT-4, what criteria would you use to evaluate whether it had a significant risk of killing everyone?
Agreed; this seems to be the path that OpenAI is taking with plugins.