Note that Shannon, three years earlier, had already trained possibly the first-ever LLM. It could generate text such as
THE HEAD AND IN FRONTAL ATTACK ON AN ENGLISH WRITER THAT THE CHARACTER OF THIS POINT IS THEREFORE ANOTHER METHOD FOR THE LETTERS THAT THE TIME OF WHO EVER TOLD THE PROBLEM FOR AN UNEXPECTED.
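That sample is Shannon's second-order word approximation, which is just a word-level Markov chain: pick each next word according to which words followed the current one in the training text. A minimal sketch (corpus, function names, and seed are my own, not Shannon's):

```python
import random
from collections import defaultdict

def train_bigram(text):
    # Record, for each word, the list of words that followed it in the corpus.
    words = text.split()
    table = defaultdict(list)
    for w1, w2 in zip(words, words[1:]):
        table[w1].append(w2)
    return table

def generate(table, start, n, seed=0):
    # Walk the chain for n words; if a word has no recorded successor,
    # fall back to a uniformly random word from the table.
    rng = random.Random(seed)
    out = [start]
    for _ in range(n - 1):
        choices = table.get(out[-1]) or list(table.keys())
        out.append(rng.choice(choices))
    return " ".join(out)

corpus = "the head of the attack on the writer"
table = train_bigram(corpus)
print(generate(table, "the", 6))
```

Duplicated successors in the lists give the sampling its frequency weighting for free, which is essentially how Shannon's hand-built approximations worked.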
See [_A Mathematical Theory of Communication_](https://people.math.harvard.edu/~ctm/home/text/others/shannon/entropy/entropy.pdf).
Yes, Gibson discusses that in his article.
I should note that it’s LM, not LLM.
LOL! Details. How about LMM: Little Manual Model?