I have found that ChatGPT responds differently to the following prompts:
Tell me a story.
Tell me a story about a hero.
Tell me a realistic story.
Tell me a true story.
And if you give it specific instructions about what you want in the story, it will follow them, though not necessarily in the way you had in mind.
When you ask it for a true story, the story it returns will be true – at least in the cases I’ve checked. Now, if you keep probing one of the true stories, it might start making things up, but I haven’t tried to push it.
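If you want to rerun the same comparison programmatically rather than in the ChatGPT web interface, a minimal sketch using the OpenAI Python client might look like the following. The model name and client setup are my assumptions; the original experiment was presumably done in the chat UI.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# The four prompts compared above.
prompts = [
    "Tell me a story.",
    "Tell me a story about a hero.",
    "Tell me a realistic story.",
    "Tell me a true story.",
]

for prompt in prompts:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model choice, not from the post
        messages=[{"role": "user", "content": prompt}],
    )
    print(f"--- {prompt} ---")
    print(response.choices[0].message.content)
```

Note that each call here starts a fresh conversation, so this only compares first responses; probing a story further, as described above, would require sending follow-up messages in the same conversation.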