huh?
It’s a piece of fiction about someone using a funky language model tool to write autobiographical fiction.
Specifically, the punchline is using the repetition trap as an emotive ending bang. (Which is clever but also something that will be lost on most people who have not used large models personally, because users usually scrub or restart repetition-trap samples while tweaking the repetition penalty & other sampling parameters to minimize it as part of basic prompt engineering.)
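For a concrete sense of what "tweaking the repetition penalty & other sampling parameters" means in practice, here is a rough sketch using the Hugging Face transformers library with the small GPT-2 checkpoint as a stand-in; the library, model, prompt, and parameter values are illustrative assumptions, not anything specific from this thread.

```python
# Contrast greedy decoding, which often falls into the repetition trap,
# with nucleus sampling plus a repetition penalty -- the usual knobs
# people adjust to keep samples from looping.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The Emperor Wu looked out over the river and"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding: prone to repeating a phrase until the token budget runs out.
greedy = model.generate(**inputs, do_sample=False, max_new_tokens=80)

# Nucleus sampling + repetition penalty: typically avoids the trap.
sampled = model.generate(
    **inputs,
    do_sample=True,
    top_p=0.9,               # nucleus sampling
    temperature=0.8,
    repetition_penalty=1.2,  # penalize tokens already generated
    max_new_tokens=80,
)

print(tokenizer.decode(greedy[0], skip_special_tokens=True))
print(tokenizer.decode(sampled[0], skip_special_tokens=True))
```

Running both and comparing the outputs makes the repetition trap, and why users normally sample it away, immediately visible.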
Is there some place I can see an example?
I include some examples in my pages, and papers on better sampling (like the nucleus-sampling or unlikelihood-training papers) will usually include some… A specific example which comes to mind where the repetition sorta works esthetically, similar to OP, is the ‘Emperor Wu’ poem GPT-2 example Scott highlights in https://slatestarcodex.com/2019/03/14/gwerns-ai-generated-poetry/
I would recommend just playing around with a language model. goose.ai/NovelAI are cheap and quite good if you can’t get GPT-3 access.
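As a rough sketch of what "playing around" looks like via an API: goose.ai exposed an OpenAI-compatible completions endpoint, so the legacy (pre-1.0) openai Python client could be pointed at it. The base URL, engine name, and parameter values below are assumptions for illustration, not details from this thread.

```python
import openai

openai.api_base = "https://api.goose.ai/v1"  # assumed goose.ai endpoint
openai.api_key = "YOUR_GOOSEAI_KEY"

completion = openai.Completion.create(
    engine="gpt-neo-20b",   # hypothetical engine name; check the provider's model list
    prompt="The Emperor Wu looked out over the river and",
    max_tokens=120,
    temperature=0.8,
    top_p=0.9,
    frequency_penalty=0.5,  # the API-side analogue of a repetition penalty
)
print(completion.choices[0].text)
```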