True, although to grab the modern reader’s attention it makes sense to have some bang in the first short chapter.
But it gets trickier after the first chapter. Perhaps one can use some iterative approach like this to circumvent that.
Additionally, the API could be more useful than the vanilla web chat, since one can set the maximum output length there, especially with the large-context models.
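For what it's worth, here is a minimal sketch of what that looks like, assuming the OpenAI Python client; the model name, token cap, and prompts are purely illustrative, not a recommendation:

```python
# Minimal sketch: calling a chat model through the API with an explicit
# output-length cap (model name and limits below are illustrative).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o",      # any large-context chat model
    max_tokens=2000,     # cap the length of each generated chapter/section
    messages=[
        {"role": "system", "content": "You are drafting one chapter of a book."},
        {"role": "user", "content": "Write the next chapter, continuing from the outline."},
    ],
)

print(response.choices[0].message.content)
```

The same cap can be varied per call, which makes the iterative chapter-by-chapter approach easier to steer than the web chat.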