It could be useful to look at the performance of GPT-3 on foreign languages. We know roughly how long it takes humans to reach a given level in a foreign language. For example, you might find GPT-3 is at a level in 15 different languages that would take a smart human (say) 30 months to achieve (2 months per language). Foreign languages make up just a small fraction of the training data.
I think I remember seeing somewhere that LLMs learn more slowly on languages with "more complex" grammar (in the sense that their loss decreases more slowly for the same number of training tokens), but I can't find the source right now.
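For what it's worth, the per-token-loss comparison is easy to probe directly with any open causal LM. Below is a minimal sketch, using GPT-2 as a stand-in (GPT-3 itself isn't available for this kind of probing) and a few placeholder sentences rather than real evaluation data. One caveat baked into the comments: raw per-token loss isn't perfectly comparable across languages, because the tokenizer splits some languages into more tokens than others.

```python
# Minimal sketch: average per-token loss of a causal LM on short text samples
# in several languages. Model name and sample sentences are placeholders.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "gpt2"  # stand-in for GPT-3; swap in any causal LM you can run
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

samples = {
    "English": "The cat sat on the mat and watched the rain.",
    "German": "Die Katze saß auf der Matte und beobachtete den Regen.",
    "Finnish": "Kissa istui matolla ja katseli sadetta.",
}

for lang, text in samples.items():
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        # The causal-LM loss is mean cross-entropy per predicted token,
        # i.e. the "loss per token" quantity discussed above. Note that
        # token counts differ across languages, which confounds direct
        # comparisons of this number.
        loss = model(ids, labels=ids).loss.item()
    print(f"{lang}: {loss:.2f} nats/token over {ids.shape[1]} tokens")
```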