This emphasis on generality makes deploying future models much easier. We first build a GPT-4 ecosystem; when GPT-5 comes out it will be easy to slot in (e.g. AutoGPT runs just as easily on GPT-4 as on GPT-5). The adaptations needed are very small, so very fast deployment of future models is to be expected.
Yes. Right now we would have to re-train all the LoRA weights of a model when an updated version comes out, but I imagine that at some point we would have “transpilers” for adapters that don’t use natural language as their API as well.
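To make the re-training point concrete, here is a minimal NumPy sketch (toy sizes, not any specific library's API) of why LoRA adapter weights are tied to one particular base model: the low-rank delta is learned against a specific base weight matrix, so swapping in a new base model's weights underneath it breaks the fine-tuned behavior.

```python
import numpy as np

# LoRA in miniature: a fine-tuned layer is W + B @ A, where A and B
# are small low-rank matrices trained against one base matrix W.
d, k, r = 8, 8, 2                    # layer dims and LoRA rank (toy sizes)
rng = np.random.default_rng(0)

W_v1 = rng.normal(size=(d, k))       # base model v1 weights
A = rng.normal(size=(r, k)) * 0.1    # adapter trained against W_v1
B = rng.normal(size=(d, r)) * 0.1

W_tuned = W_v1 + B @ A               # effective fine-tuned layer

# A new base release ships different weights. The adapter's delta was
# learned relative to W_v1, so re-attaching it to W_v2 does not
# reproduce the tuned layer -- hence re-training (or a "transpiler").
W_v2 = rng.normal(size=(d, k))       # base model v2 weights
drift = np.linalg.norm((W_v2 + B @ A) - W_tuned)
print(f"deviation from tuned layer: {drift:.2f}")
```

The deviation is essentially the distance between the two base weight matrices, which is large for independently released models; a hypothetical adapter "transpiler" would have to compensate for exactly that gap.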