Couldn’t you just prompt a different model to modify all the training data, both text and images, so that it is either consistent with the earth being flat or states that flatness is impossible?

The model wouldn’t be allowed to learn from user sessions (as gpt-n does) or to generate answers and reflect on its own beliefs (the approach used to fine-tune gpt-4).