Copying and pasting an entire paper/blog and asking the model to summarize it? That isn't hard to do, and it's easy to check whether there are enough tokens: just run the text through any BPE tokenizer available online.
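For instance, a minimal sketch using OpenAI's tiktoken library. Nobody knows which tokenizer this model actually uses, so cl100k_base (the GPT-4 encoding) is an assumption here, and the filename is just a placeholder:

```python
import tiktoken

# Hypothetical file holding the pasted paper/blog text.
text = open("paper.txt").read()

# cl100k_base is an assumption; pass "gpt2" instead for the original GPT-2 BPE.
enc = tiktoken.get_encoding("cl100k_base")
tokens = enc.encode(text)
print(f"{len(text)} characters -> {len(tokens)} BPE tokens")
```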
Sure, the poem prompt I mentioned is about 3,500 characters on its own, and before I used up my quota yesterday it had no issue repeatedly revising and printing out 4 new iterations of the poem without apparently forgetting anything, so that conversation must have run to several thousand BPE tokens.
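As a rough sanity check (the ~4 characters per token figure is a rule of thumb for English text, not this model's actual tokenizer):

```python
# Back-of-envelope estimate of the conversation's token count.
prompt_chars = 3500
chars_per_token = 4  # common heuristic for English BPE text
prompt_tokens = prompt_chars / chars_per_token  # ~875 tokens

# Original prompt plus 4 revised poems of similar length.
total_tokens = prompt_tokens * (1 + 4)
print(f"~{total_tokens:.0f} tokens")  # ~4375, i.e. "several thousand"
```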
Yeah, I saw your other replies in another thread, and when I tested it myself later today I got the same result: it's most likely OpenAI's new LLM. I'm still confused about why they'd call it gpt2, though.
Altman made a Twitter-edit joke about 'gpt-2 i mean gpt2', so at this point I think it's just a funny troll-name tied to the 'v2 personality', which would make it a successor to the original ChatGPT 'personality', presumably 'v1'. See, it's gpt-v2, geddit, not gpt-2? Very funny, everyone lol at troll.