Yes. In my model that is something that can happen. But it does need from-the-outside access to do this.
Set the LLM up in a sealed box, and the mask can’t do this. Set it up so the LLM can run arbitrary terminal commands, and write code that modifies it’s own weights, and this can happen.
Yes. In my model that is something that can happen. But it does need from-the-outside access to do this.
Set the LLM up in a sealed box, and the mask can’t do this. Set it up so the LLM can run arbitrary terminal commands, and write code that modifies it’s own weights, and this can happen.