Yeah, I was thinking you might be able to fill the context adequately, because otherwise you'd be stuck in an awkward spot: too many examples to cheaply include in the prompt and make the small, cheap models work out, but still not enough for finetuning to really shine by training a larger high-end model over millions of tokens to zero-shot it.