Where it gets interesting is when you leave the space of token strings the machine has literally seen, but land somewhere "in between" those strings in input space. That's why this works at all and exhibits any intelligence.
For example, if it has seen a whole bunch of patterns like "A->B" and "C->D", then given the unseen input "G" it will complete with "->H".
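A toy sketch of that contrast (the names and the successor rule here are my own illustration, not anything from an actual model): a pure lookup table can only complete prompts it has literally memorized, while something that has internalized the underlying rule can complete prompts it never saw.

```python
# Toy contrast between memorizing strings and learning the pattern.
TRAINING = {"A": "->B", "C": "->D", "E": "->F"}

def memorizer(prompt):
    # Pure string memory: returns None for anything outside the training set.
    return TRAINING.get(prompt)

def generalizer(prompt):
    # Stand-in for a model that has internalized the rule
    # "complete a letter with its successor".
    return "->" + chr(ord(prompt) + 1)

print(memorizer("G"))    # None: "G" was never seen
print(generalizer("G"))  # ->H: the rule interpolates to unseen inputs
```

The interesting behavior lives entirely in the second function: completing inputs that fall between the memorized examples.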
Or, for presidents' ages: what if the president isn't real? https://chat.openai.com/share/3ccdc340-ada5-4471-b114-0b936d1396ad
There are fake/fictional presidents in the training data.